improvement: brand new way of installing and using Spark #54
Conversation
Force-pushed from 0a25208 to 5693cc4 (…exec" and "spark_run.sh driver", upload s3mdbseq_keys.txt from a standard path)
Force-pushed from 0860d55 to 41c5822 (…d the endpoints TLS certs)
Nothing major, just questions and some minor nitpicks.
The build.sh script:
- Emits many warnings about unmet peer dependencies, and non-deterministically skips some steps.
- Fails at STEP 25/31 during the Docker build:

--> 0edb15cea50b [2/2] STEP 23/31: COPY conf/spark-defaults.conf ${SPARK_HOME}/conf
--> d9039f8c56fe [2/2] STEP 24/31: COPY conf/spark-env.sh ${SPARK_HOME}/conf
--> 54bcea75de6f [2/2] STEP 25/31: COPY aws-java-sdk-bundle-1.12.770.ja[r] /spark/jars/
Error: building at STEP "COPY aws-java-sdk-bundle-1.12.770.ja[r] /spark/jars/": checking on sources under "/home/trevorbenson/Projects/spark": Rel: can't make relative to /home/trevorbenson/Projects/spark; copier: stat: ["/aws-java-sdk-bundle-1.12.770.ja[r]"]: no such file or directory
Upload /tmp/spark-image-3.5.2-12.tgz and /tmp/scality-spark-scripts-3.5.2-12.tgz to the supervisor.
Once it builds, I'll approve. If you don't observe the same failure during your build, let me know and I'll check whether it's somehow unique to my environment.
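In case the jar is simply missing from the build context, here is a minimal pre-fetch sketch. It assumes build.sh expects the bundle in the repository root (not confirmed by the PR); the URL follows the standard Maven Central layout for com.amazonaws:aws-java-sdk-bundle:1.12.770:

```sh
# Hypothetical fix: download the jar the COPY step expects into the
# build context root before running build.sh.
curl -fLo aws-java-sdk-bundle-1.12.770.jar \
  "https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.12.770/aws-java-sdk-bundle-1.12.770.jar"
```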
COPY --from=ghcr.io/astral-sh/uv:0.4.8 /uv /bin/uv

RUN --mount=type=cache,target=/root/.cache/uv \
    uv pip compile /tmp/requirements.txt > /tmp/requirements-compiled.txt \
    && uv pip sync --system /tmp/requirements-compiled.txt \
    && uv pip install --system /tmp/scality-0.1-py3-none-any.whl
question:
Is the legacy uv version a workaround to get pip3.8 to install the requirements file without tracebacks?
With Python 3.8 reaching EOL in October 2024, is there any reason not to take this opportunity to bump to Python 3.9, or to Python 3.11, the maximum version Spark 3.5.2 supports?
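If the bump is taken, a sketch of the same compile/sync flow targeted at Python 3.11. This is a hypothetical variant: it assumes a python3.11 interpreter exists in the image, and I haven't verified the --python selector against uv 0.4.8 specifically:

```sh
# Hypothetical: resolve the pins once, then sync them into the
# python3.11 environment rather than the system default.
uv pip compile /tmp/requirements.txt -o /tmp/requirements-compiled.txt
uv pip sync --python python3.11 /tmp/requirements-compiled.txt
uv pip install --python python3.11 /tmp/scality-0.1-py3-none-any.whl
```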
result = calculate_sum_of_squares()

# Afficher le résultat
print(f"La somme des carrés est : {result}")
nitpick:
Use one consistent language for all tools, either all French or all English (e.g. `# Afficher le résultat` → `# Display the result`).
# Add the server's short hostname for master
echo "${master} $(hostname -s) # Added by spark_run.sh" >> /etc/hosts
Consistent indentation depth.
Suggested change:

    # Add the server's short hostname for master
    echo "${master} $(hostname -s) # Added by spark_run.sh" >> /etc/hosts
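Related thought while touching these lines: a guard like the sketch below would make the append idempotent across repeated spark_run.sh invocations. The grep check is my assumption of intent, not part of this PR:

```sh
# Hypothetical: append the mapping only if the short hostname
# is not already present in /etc/hosts.
grep -q "$(hostname -s)" /etc/hosts || \
  echo "${master} $(hostname -s) # Added by spark_run.sh" >> /etc/hosts
```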
This PR brings the following changes:
Thanks a lot @mobidyc and @scality-fno for your hard work bringing Spark to a new level! It is now time to CONVERT THE TRY!