<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Tools on MCP Toolbox for Databases</title><link>/integrations/serverless-spark/tools/</link><description>Recent content in Tools on MCP Toolbox for Databases</description><generator>Hugo</generator><language>en</language><atom:link href="/integrations/serverless-spark/tools/index.xml" rel="self" type="application/rss+xml"/><item><title>serverless-spark-get-batch</title><link>/integrations/serverless-spark/tools/serverless-spark-get-batch/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-get-batch/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>The &lt;code>serverless-spark-get-batch&lt;/code> tool retrieves a specific
Serverless Spark batch job from a Google Cloud Serverless for Apache Spark source.&lt;/p>
&lt;p>&lt;code>serverless-spark-get-batch&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>name&lt;/code>&lt;/strong> (required): The short name of the batch, e.g. for
&lt;code>projects/my-project/locations/us-central1/batches/my-batch&lt;/code>, pass &lt;code>my-batch&lt;/code>.&lt;/li>
&lt;/ul>
&lt;p>The tool gets the &lt;code>project&lt;/code> and &lt;code>location&lt;/code> from the source configuration.&lt;/p>
&lt;h2 id="compatible-sources">Compatible Sources&lt;/h2>



&lt;div class="compatibility-section">
 &lt;p>This tool can be used with the following database sources:&lt;/p>

 &lt;table>
 &lt;thead>
 &lt;tr>
 &lt;th>Source Name&lt;/th>
 &lt;/tr>
 &lt;/thead>
 &lt;tbody>
 
 
 &lt;tr>
 &lt;td>&lt;a href="/integrations/serverless-spark/">Serverless for Apache Spark&lt;/a>&lt;/td>
 &lt;/tr>
 

 
 
 
 &lt;/tbody>
 &lt;/table>
&lt;/div>
&lt;h2 id="example">Example&lt;/h2>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="line">&lt;span class="cl">&lt;span class="nt">kind&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">tool&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">get_my_batch&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">type&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">serverless-spark-get-batch&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">source&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">my-serverless-spark-source&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">description&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">Use this tool to get a serverless spark batch.&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="output-format">Output Format&lt;/h2>
&lt;p>The response contains the full Batch object as defined in the &lt;a href="https://cloud.google.com/dataproc-serverless/docs/reference/rest/v1/projects.locations.batches#Batch">API
spec&lt;/a>,
plus two additional fields, &lt;code>consoleUrl&lt;/code> and &lt;code>logsUrl&lt;/code>, which a human can
follow for more detailed information.&lt;/p></description></item><item><title>serverless-spark-get-session-template</title><link>/integrations/serverless-spark/tools/serverless-spark-get-session-template/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-get-session-template/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>A &lt;code>serverless-spark-get-session-template&lt;/code> tool retrieves a specific Spark session template from a
Google Cloud Serverless for Apache Spark source.&lt;/p>
&lt;p>&lt;code>serverless-spark-get-session-template&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>name&lt;/code>&lt;/strong> (required): The short name of the session template, e.g. for &lt;code>projects/my-project/locations/us-central1/sessionTemplates/my-session-template&lt;/code>, pass &lt;code>my-session-template&lt;/code>.&lt;/li>
&lt;/ul>
&lt;p>The tool gets the &lt;code>project&lt;/code> and &lt;code>location&lt;/code> from the source configuration.&lt;/p>
&lt;h2 id="compatible-sources">Compatible Sources&lt;/h2>



&lt;div class="compatibility-section">
 &lt;p>This tool can be used with the following database sources:&lt;/p>

 &lt;table>
 &lt;thead>
 &lt;tr>
 &lt;th>Source Name&lt;/th>
 &lt;/tr>
 &lt;/thead>
 &lt;tbody>
 
 
 &lt;tr>
 &lt;td>&lt;a href="/integrations/serverless-spark/">Serverless for Apache Spark&lt;/a>&lt;/td>
 &lt;/tr>
 

 
 
 
 &lt;/tbody>
 &lt;/table>
&lt;/div>
&lt;h2 id="example">Example&lt;/h2>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="line">&lt;span class="cl">&lt;span class="nt">kind&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">tool&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">get_spark_session_template&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">type&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">serverless-spark-get-session-template&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">source&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">my-serverless-spark-source&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="w">&lt;/span>&lt;span class="nt">description&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="l">Use this tool to get details of a serverless spark session template.&lt;/span>&lt;span class="w">
&lt;/span>&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="output-format">Output Format&lt;/h2>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-json" data-lang="json">&lt;span class="line">&lt;span class="cl">&lt;span class="p">{&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="nt">&amp;#34;sessionTemplate&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="p">{&lt;/span> 
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="nt">&amp;#34;name&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;projects/my-project/locations/us-central1/sessionTemplates/my-session-template&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="nt">&amp;#34;description&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;Template for Spark Session&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl"> &lt;span class="c1">// ... complete session template resource definition
&lt;/span>&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="c1">&lt;/span> &lt;span class="p">}&lt;/span>
&lt;/span>&lt;/span>&lt;span class="line">&lt;span class="cl">&lt;span class="p">}&lt;/span>
&lt;/span>&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="reference">Reference&lt;/h2>
&lt;table>
 &lt;thead>
 &lt;tr>
 &lt;th>&lt;strong>field&lt;/strong>&lt;/th>
 &lt;th style="text-align: center">&lt;strong>type&lt;/strong>&lt;/th>
 &lt;th style="text-align: center">&lt;strong>required&lt;/strong>&lt;/th>
 &lt;th>&lt;strong>description&lt;/strong>&lt;/th>
 &lt;/tr>
 &lt;/thead>
 &lt;tbody>
 &lt;tr>
 &lt;td>type&lt;/td>
 &lt;td style="text-align: center">string&lt;/td>
 &lt;td style="text-align: center">true&lt;/td>
 &lt;td>Must be &amp;ldquo;serverless-spark-get-session-template&amp;rdquo;.&lt;/td>
 &lt;/tr>
 &lt;tr>
 &lt;td>source&lt;/td>
 &lt;td style="text-align: center">string&lt;/td>
 &lt;td style="text-align: center">true&lt;/td>
 &lt;td>Name of the source the tool should use.&lt;/td>
 &lt;/tr>
 &lt;tr>
 &lt;td>description&lt;/td>
 &lt;td style="text-align: center">string&lt;/td>
 &lt;td style="text-align: center">true&lt;/td>
 &lt;td>Description of the tool that is passed to the LLM.&lt;/td>
 &lt;/tr>
 &lt;tr>
 &lt;td>authRequired&lt;/td>
 &lt;td style="text-align: center">string[]&lt;/td>
 &lt;td style="text-align: center">false&lt;/td>
 &lt;td>List of auth services required to invoke this tool.&lt;/td>
 &lt;/tr>
 &lt;/tbody>
&lt;/table></description></item><item><title>serverless-spark-list-batches</title><link>/integrations/serverless-spark/tools/serverless-spark-list-batches/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-list-batches/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>A &lt;code>serverless-spark-list-batches&lt;/code> tool returns a list of Spark batches from a
Google Cloud Serverless for Apache Spark source.&lt;/p>
&lt;p>&lt;code>serverless-spark-list-batches&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>filter&lt;/code>&lt;/strong> (optional): A filter expression to limit the batches returned.
Filters are case sensitive and may contain multiple clauses combined with
logical operators (AND/OR). Supported fields are &lt;code>batch_id&lt;/code>, &lt;code>batch_uuid&lt;/code>,
&lt;code>state&lt;/code>, &lt;code>create_time&lt;/code>, and &lt;code>labels&lt;/code>. For example: &lt;code>state = RUNNING AND create_time &amp;lt; &amp;quot;2023-01-01T00:00:00Z&amp;quot;&lt;/code>.&lt;/li>
&lt;li>&lt;strong>&lt;code>pageSize&lt;/code>&lt;/strong> (optional): The maximum number of batches to return in a single
page.&lt;/li>
&lt;li>&lt;strong>&lt;code>pageToken&lt;/code>&lt;/strong> (optional): A page token, received from a previous call, to
retrieve the next page of results.&lt;/li>
&lt;/ul>
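&lt;p>For illustration, a tool configuration following the same pattern as the other
Serverless Spark tools might look like the sketch below (the &lt;code>name&lt;/code> and
&lt;code>source&lt;/code> values are placeholders):&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">kind: tool
name: list_my_batches
type: serverless-spark-list-batches
source: my-serverless-spark-source
description: Use this tool to list serverless spark batches.
&lt;/code>&lt;/pre>&lt;/div>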
&lt;p>The tool gets the &lt;code>project&lt;/code> and &lt;code>location&lt;/code> from the source configuration.&lt;/p></description></item><item><title>serverless-spark-cancel-batch</title><link>/integrations/serverless-spark/tools/serverless-spark-cancel-batch/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-cancel-batch/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>The &lt;code>serverless-spark-cancel-batch&lt;/code> tool cancels a running Spark batch operation in
a Google Cloud Serverless for Apache Spark source. The cancellation request is
asynchronous, so the batch state will not change immediately after the tool
returns; it can take a minute or so for the cancellation to be reflected.&lt;/p>
&lt;p>&lt;code>serverless-spark-cancel-batch&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>operation&lt;/code>&lt;/strong> (required): The name of the operation to cancel. For example,
for &lt;code>projects/my-project/locations/us-central1/operations/my-operation&lt;/code>, you
would pass &lt;code>my-operation&lt;/code>.&lt;/li>
&lt;/ul>
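&lt;p>For illustration, a tool configuration following the same pattern as the other
Serverless Spark tools might look like the sketch below (the &lt;code>name&lt;/code> and
&lt;code>source&lt;/code> values are placeholders):&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">kind: tool
name: cancel_my_batch
type: serverless-spark-cancel-batch
source: my-serverless-spark-source
description: Use this tool to cancel a running serverless spark batch operation.
&lt;/code>&lt;/pre>&lt;/div>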
&lt;p>The tool inherits the &lt;code>project&lt;/code> and &lt;code>location&lt;/code> from the source configuration.&lt;/p></description></item><item><title>serverless-spark-create-pyspark-batch</title><link>/integrations/serverless-spark/tools/serverless-spark-create-pyspark-batch/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-create-pyspark-batch/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>A &lt;code>serverless-spark-create-pyspark-batch&lt;/code> tool submits a Spark batch to a Google
Cloud Serverless for Apache Spark source. The workload executes asynchronously
and takes around a minute to begin executing; status can be polled using the
&lt;a href="/integrations/serverless-spark/tools/serverless-spark-get-batch/">get batch&lt;/a> tool.&lt;/p>
&lt;p>&lt;code>serverless-spark-create-pyspark-batch&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>mainFile&lt;/code>&lt;/strong> (required): The &lt;code>gs://&lt;/code> URI of the main Python file.&lt;/li>
&lt;li>&lt;strong>&lt;code>args&lt;/code>&lt;/strong> (optional): A list of arguments passed to the main file.&lt;/li>
&lt;li>&lt;strong>&lt;code>version&lt;/code>&lt;/strong> (optional): The Serverless &lt;a href="https://docs.cloud.google.com/dataproc-serverless/docs/concepts/versions/dataproc-serverless-versions">runtime
version&lt;/a>
to execute with.&lt;/li>
&lt;/ul>
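&lt;p>For illustration, a tool configuration following the same pattern as the other
Serverless Spark tools might look like the sketch below (the &lt;code>name&lt;/code> and
&lt;code>source&lt;/code> values are placeholders; &lt;code>mainFile&lt;/code> and the other
parameters above are supplied at invocation time, not in the configuration):&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">kind: tool
name: create_pyspark_batch
type: serverless-spark-create-pyspark-batch
source: my-serverless-spark-source
description: Use this tool to submit a PySpark batch workload.
&lt;/code>&lt;/pre>&lt;/div>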
&lt;h2 id="compatible-sources">Compatible Sources&lt;/h2>



&lt;div class="compatibility-section">
 &lt;p>This tool can be used with the following database sources:&lt;/p></description></item><item><title>serverless-spark-create-spark-batch</title><link>/integrations/serverless-spark/tools/serverless-spark-create-spark-batch/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>/integrations/serverless-spark/tools/serverless-spark-create-spark-batch/</guid><description>&lt;h2 id="about">About&lt;/h2>
&lt;p>A &lt;code>serverless-spark-create-spark-batch&lt;/code> tool submits a Java Spark batch to a
Google Cloud Serverless for Apache Spark source. The workload executes
asynchronously and takes around a minute to begin executing; status can be
polled using the &lt;a href="/integrations/serverless-spark/tools/serverless-spark-get-batch/">get batch&lt;/a> tool.&lt;/p>
&lt;p>&lt;code>serverless-spark-create-spark-batch&lt;/code> accepts the following parameters:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>&lt;code>mainJarFile&lt;/code>&lt;/strong> (optional): The &lt;code>gs://&lt;/code> URI of the jar file that contains the
main class. Exactly one of &lt;code>mainJarFile&lt;/code> or &lt;code>mainClass&lt;/code> must be specified.&lt;/li>
&lt;li>&lt;strong>&lt;code>mainClass&lt;/code>&lt;/strong> (optional): The name of the driver&amp;rsquo;s main class. Exactly one of
&lt;code>mainJarFile&lt;/code> or &lt;code>mainClass&lt;/code> must be specified.&lt;/li>
&lt;li>&lt;strong>&lt;code>jarFiles&lt;/code>&lt;/strong> (optional): A list of &lt;code>gs://&lt;/code> URIs of jar files to add to the classpaths of
the Spark driver and tasks.&lt;/li>
&lt;li>&lt;strong>&lt;code>args&lt;/code>&lt;/strong> (optional): A list of arguments passed to the driver.&lt;/li>
&lt;li>&lt;strong>&lt;code>version&lt;/code>&lt;/strong> (optional): The Serverless &lt;a href="https://docs.cloud.google.com/dataproc-serverless/docs/concepts/versions/dataproc-serverless-versions">runtime
version&lt;/a>
to execute with.&lt;/li>
&lt;/ul>
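&lt;p>For illustration, a tool configuration following the same pattern as the other
Serverless Spark tools might look like the sketch below (the &lt;code>name&lt;/code> and
&lt;code>source&lt;/code> values are placeholders; &lt;code>mainJarFile&lt;/code>, &lt;code>mainClass&lt;/code>,
and the other parameters above are supplied at invocation time, not in the configuration):&lt;/p>
&lt;div class="highlight">&lt;pre tabindex="0" class="chroma">&lt;code class="language-yaml" data-lang="yaml">kind: tool
name: create_spark_batch
type: serverless-spark-create-spark-batch
source: my-serverless-spark-source
description: Use this tool to submit a Java Spark batch workload.
&lt;/code>&lt;/pre>&lt;/div>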
&lt;h2 id="compatible-sources">Compatible Sources&lt;/h2>



&lt;div class="compatibility-section">
 &lt;p>This tool can be used with the following database sources:&lt;/p></description></item></channel></rss>