Commit ece1470
[SPARK-49428][SQL] Move Connect Scala Client from Connector to SQL
### What changes were proposed in this pull request?
This PR moves the Connect Scala JVM client project to sql. It also moves connect/bin and connect/doc to sql.

### Why are the changes needed?
Connect is part of the sql project now. It is weird to keep these separate.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Existing tests.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #49695 from hvanhovell/SPARK-49428.

Authored-by: Herman van Hovell <[email protected]>
Signed-off-by: Herman van Hovell <[email protected]>
Parent: ecf6851
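The change applied throughout this commit amounts to a single path-prefix move. A minimal sketch (hypothetical helper, not part of the PR) of the mapping applied to client paths:

```python
# Hypothetical helper illustrating this commit's relocation: references to
# the Connect Scala client under connector/connect/ move to sql/connect/.
OLD_PREFIX = "connector/connect/client/jvm"
NEW_PREFIX = "sql/connect/client/jvm"

def rewrite_client_path(path: str) -> str:
    """Map a pre-move client path to its post-move location."""
    if path.startswith(OLD_PREFIX):
        return NEW_PREFIX + path[len(OLD_PREFIX):]
    return path

print(rewrite_client_path("connector/connect/client/jvm/pom.xml"))
```

Paths outside the moved module (e.g. `sql/connect/server`) pass through unchanged, which matches the diffs below touching only client references.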

File tree: 69 files changed, +21 −25 lines

.github/labeler.yml

Lines changed: 0 additions & 1 deletion

@@ -223,7 +223,6 @@ CONNECT:
   - changed-files:
     - any-glob-to-any-file: [
       'sql/connect/**/*',
-      'connector/connect/**/*',
       'python/**/connect/**/*'
     ]

.github/workflows/maven_test.yml

Lines changed: 1 addition & 1 deletion

@@ -194,7 +194,7 @@ jobs:
           if [[ "$INCLUDED_TAGS" != "" ]]; then
             ./build/mvn $MAVEN_CLI_OPTS -pl "$TEST_MODULES" -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Pjvm-profiler -Pspark-ganglia-lgpl -Pkinesis-asl -Djava.version=${JAVA_VERSION/-ea} -Dtest.include.tags="$INCLUDED_TAGS" test -fae
           elif [[ "$MODULES_TO_TEST" == "connect" ]]; then
-            ./build/mvn $MAVEN_CLI_OPTS -Dtest.exclude.tags="$EXCLUDED_TAGS" -Djava.version=${JAVA_VERSION/-ea} -pl connector/connect/client/jvm,sql/connect/common,sql/connect/server test -fae
+            ./build/mvn $MAVEN_CLI_OPTS -Dtest.exclude.tags="$EXCLUDED_TAGS" -Djava.version=${JAVA_VERSION/-ea} -pl sql/connect/client/jvm,sql/connect/common,sql/connect/server test -fae
           elif [[ "$EXCLUDED_TAGS" != "" ]]; then
             ./build/mvn $MAVEN_CLI_OPTS -pl "$TEST_MODULES" -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Pjvm-profiler -Pspark-ganglia-lgpl -Pkinesis-asl -Djava.version=${JAVA_VERSION/-ea} -Dtest.exclude.tags="$EXCLUDED_TAGS" test -fae
           elif [[ "$MODULES_TO_TEST" == *"sql#hive-thriftserver"* ]]; then
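The workflow lines above pass `-Djava.version=${JAVA_VERSION/-ea}`: bash's `${var/pattern}` substitution deletes the first `-ea` (early-access) suffix from the Java version before handing it to Maven. A Python sketch of the same transformation:

```python
def strip_ea_suffix(java_version: str) -> str:
    # Mirrors bash's ${JAVA_VERSION/-ea}: remove the first "-ea" occurrence.
    return java_version.replace("-ea", "", 1)

print(strip_ea_suffix("21-ea"))  # 21
print(strip_ea_suffix("17"))    # 17
```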

assembly/pom.xml

Lines changed: 2 additions & 2 deletions

@@ -192,7 +192,7 @@
               <executable>cp</executable>
               <arguments>
                 <argument>-r</argument>
-                <argument>${basedir}/../connector/connect/client/jvm/target/connect-repl</argument>
+                <argument>${basedir}/../sql/connect/client/jvm/target/connect-repl</argument>
                 <argument>${basedir}/target/scala-${scala.binary.version}/jars/</argument>
               </arguments>
             </configuration>
@@ -206,7 +206,7 @@
             <configuration>
               <executable>cp</executable>
               <arguments>
-                <argument>${basedir}/../connector/connect/client/jvm/target/spark-connect-client-jvm_${scala.binary.version}-${project.version}.jar</argument>
+                <argument>${basedir}/../sql/connect/client/jvm/target/spark-connect-client-jvm_${scala.binary.version}-${project.version}.jar</argument>
                 <argument>${basedir}/target/scala-${scala.binary.version}/jars/connect-repl</argument>
               </arguments>
             </configuration>

dev/lint-scala

Lines changed: 3 additions & 3 deletions

@@ -34,14 +34,14 @@ ERRORS=$(./build/mvn \
   -pl sql/api \
   -pl sql/connect/common \
   -pl sql/connect/server \
-  -pl connector/connect/client/jvm \
+  -pl sql/connect/client/jvm \
   2>&1 | grep -e "Unformatted files found" \
 )

 if test ! -z "$ERRORS"; then
-  echo -e "The scalafmt check failed on sql/connect or connector/connect at following occurrences:\n\n$ERRORS\n"
+  echo -e "The scalafmt check failed on sql/connect or sql/connect at following occurrences:\n\n$ERRORS\n"
   echo "Before submitting your change, please make sure to format your code using the following command:"
-  echo "./build/mvn scalafmt:format -Dscalafmt.skip=false -Dscalafmt.validateOnly=false -Dscalafmt.changedOnly=false -pl sql/api -pl sql/connect/common -pl sql/connect/server -pl connector/connect/client/jvm"
+  echo "./build/mvn scalafmt:format -Dscalafmt.skip=false -Dscalafmt.validateOnly=false -Dscalafmt.changedOnly=false -pl sql/api -pl sql/connect/common -pl sql/connect/server -pl sql/connect/client/jvm"
   exit 1
 else
   echo -e "Scalafmt checks passed."

dev/protobuf-breaking-changes-check.sh

Lines changed: 1 addition & 1 deletion

@@ -35,7 +35,7 @@ fi

 pushd sql/connect/common/src/main &&
   echo "Start protobuf breaking changes checking against $BRANCH" &&
-  buf breaking --against "https://github.com/apache/spark.git#branch=$BRANCH,subdir=connector/connect/common/src/main" &&
+  buf breaking --against "https://github.com/apache/spark.git#branch=$BRANCH,subdir=sql/connect/common/src/main" &&
   echo "Finsh protobuf breaking changes checking: SUCCESS"

 if [[ $? -ne -0 ]]; then

dev/sparktestsupport/modules.py

Lines changed: 0 additions & 1 deletion

@@ -334,7 +334,6 @@ def __hash__(self):
     dependencies=[hive, avro, protobuf],
     source_file_regexes=[
         "sql/connect",
-        "connector/connect",
     ],
     sbt_test_goals=[
         "connect/test",
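In `modules.py`, `source_file_regexes` tells the test infrastructure which changed files belong to the connect module; with the client now under `sql/connect`, the single `"sql/connect"` prefix covers it and the `"connector/connect"` entry can go. A simplified sketch (hypothetical; the real matching lives in `dev/sparktestsupport`) of how such regexes select a module:

```python
import re

# Post-commit regex list from the diff above.
source_file_regexes = ["sql/connect"]

def belongs_to_module(changed_file: str, regexes) -> bool:
    # A file belongs to the module if any regex matches at the start of its path.
    return any(re.match(r, changed_file) for r in regexes)

print(belongs_to_module("sql/connect/client/jvm/src/Foo.scala", source_file_regexes))
print(belongs_to_module("connector/avro/src/Bar.scala", source_file_regexes))
```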

docs/_plugins/build_api_docs.rb

Lines changed: 2 additions & 2 deletions

@@ -149,11 +149,11 @@ def build_scala_and_java_docs
   # Copy over the unified ScalaDoc for all projects to api/scala.
   # This directory will be copied over to _site when `jekyll` command is run.
   copy_and_update_scala_docs("../target/scala-2.13/unidoc", "api/scala")
-  # copy_and_update_scala_docs("../connector/connect/client/jvm/target/scala-2.13/unidoc", "api/connect/scala")
+  # copy_and_update_scala_docs("../sql/connect/client/jvm/target/scala-2.13/unidoc", "api/connect/scala")

   # Copy over the unified JavaDoc for all projects to api/java.
   copy_and_update_java_docs("../target/javaunidoc", "api/java", "api/scala")
-  # copy_and_update_java_docs("../connector/connect/client/jvm/target/javaunidoc", "api/connect/java", "api/connect/scala")
+  # copy_and_update_java_docs("../sql/connect/client/jvm/target/javaunidoc", "api/connect/java", "api/connect/scala")
 end

 def build_python_docs

docs/spark-connect-overview.md

Lines changed: 1 addition & 1 deletion

pom.xml

Lines changed: 1 addition & 1 deletion

@@ -97,6 +97,7 @@
     <module>sql/hive</module>
     <module>sql/connect/server</module>
     <module>sql/connect/common</module>
+    <module>sql/connect/client/jvm</module>
     <module>assembly</module>
     <module>examples</module>
     <module>repl</module>
@@ -106,7 +107,6 @@
     <module>connector/kafka-0-10-assembly</module>
     <module>connector/kafka-0-10-sql</module>
     <module>connector/avro</module>
-    <module>connector/connect/client/jvm</module>
     <module>connector/protobuf</module>
     <!-- See additional modules enabled by profiles below -->
   </modules>

project/SparkBuild.scala

Lines changed: 1 addition & 1 deletion

@@ -1573,7 +1573,7 @@ object CopyDependencies {
       Files.createDirectories(destDir)

       val sourceAssemblyJar = Paths.get(
-        BuildCommons.sparkHome.getAbsolutePath, "connector", "connect", "client",
+        BuildCommons.sparkHome.getAbsolutePath, "sql", "connect", "client",
         "jvm", "target", s"scala-$scalaBinaryVer", s"spark-connect-client-jvm-assembly-$sparkVer.jar")
       val destAssemblyJar = Paths.get(destDir.toString, s"spark-connect-client-jvm-assembly-$sparkVer.jar")
       Files.copy(sourceAssemblyJar, destAssemblyJar, StandardCopyOption.REPLACE_EXISTING)
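The SparkBuild change above rebuilds the assembly jar's source path segment by segment with `Paths.get`. A Python sketch of the same path construction (illustrative home directory and version values, not taken from the build):

```python
from pathlib import PurePosixPath

def assembly_jar_path(spark_home: str, scala_binary_ver: str, spark_ver: str) -> str:
    # Mirrors the updated Paths.get(...) call: the client assembly jar now
    # lives under sql/connect/client/jvm instead of connector/connect/client/jvm.
    return str(PurePosixPath(
        spark_home, "sql", "connect", "client", "jvm", "target",
        f"scala-{scala_binary_ver}",
        f"spark-connect-client-jvm-assembly-{spark_ver}.jar"))

print(assembly_jar_path("/spark", "2.13", "4.1.0"))
```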
