
Developing Against Different Spark Versions #1725

Open
james-willis opened this issue Dec 18, 2024 · 1 comment
Comments

@james-willis (Contributor)

Expected behavior

I want to follow these instructions but run specific unit tests against different Spark/Scala versions in IntelliJ.

Actual behavior

I haven't been able to figure out how to modify the project/IntelliJ setup to accomplish this.

Steps to reproduce the problem

  1. Set up IntelliJ as described in the link above
  2. Change the spark.version to 3.5 in the top-level POM file
  3. Observe that Spark 3.3 is still being used
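As one way to script step 2 for local experiments, here is a minimal sketch that rewrites the `spark.version` property in a POM file. The property name comes from the steps above; the helper name and version strings are illustrative, and this does not by itself refresh IntelliJ's imported project model (a Maven reimport is still needed):

```python
import re

# Hypothetical helper: rewrite the <spark.version> property in a pom.xml
# string so local builds compile against a different Spark release.
# Only the first occurrence is replaced, matching the single property
# declared in a top-level POM's <properties> block.
def set_spark_version(pom_text: str, new_version: str) -> str:
    return re.sub(
        r"<spark\.version>[^<]*</spark\.version>",
        f"<spark.version>{new_version}</spark.version>",
        pom_text,
        count=1,
    )

# Illustrative fragment of a top-level POM
pom = "<properties><spark.version>3.3.0</spark.version></properties>"
print(set_spark_version(pom, "3.5.0"))
# → <properties><spark.version>3.5.0</spark.version></properties>
```

After editing the POM, re-importing the Maven project in IntelliJ (Maven tool window → Reload All Maven Projects) is typically what makes the IDE pick up the new Spark dependency.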

Settings

Sedona version = HEAD of master

Apache Spark version = various

Apache Flink version = None

API type = Scala

Scala version = 2.12

JRE version = 1.8

Python version = None

Environment = IntelliJ


Thank you for your interest in Apache Sedona! We appreciate you opening your first issue. Contributions like yours help make Apache Sedona better.
