Dev docker image by indy-3rdman · Pull Request #708 · dotnet/spark · GitHub

Dev docker image #708

Open · wants to merge 9 commits into base: main

Changes from 1 commit

updated for dotnet-spark version 1.0.0
indy committed Oct 21, 2020
commit 0366ec580ca14e773c27fac72a335027ccb2e369
17 changes: 12 additions & 5 deletions docker/images/dev/Dockerfile
@@ -1,7 +1,7 @@
ARG SDK_IMAGE_TAG=3.1-bionic
FROM mcr.microsoft.com/dotnet/core/sdk:$SDK_IMAGE_TAG

ARG SPARK_VERSION=2.4.6
ARG SPARK_VERSION=3.0.1
ARG MAVEN_VERSION=3.6.3
ARG HADOOP_VERSION=2.7
ARG DEBIAN_FRONTEND=noninteractive
@@ -13,13 +13,20 @@ ENV M2_HOME=/usr/local/bin/maven/current
ENV PATH="${PATH}:${SPARK_HOME}/bin:${M2_HOME}/bin"

RUN apt-get update \
&& apt-get install -y dialog apt-utils wget ca-certificates openjdk-8-jdk bash software-properties-common supervisor unzip socat net-tools vim \
&& apt-get install -y --no-install-recommends \
apt-utils \
ca-certificates \
dialog \
openjdk-8-jdk \
socat \
software-properties-common \
supervisor \
unzip \
wget \
&& add-apt-repository universe \
&& apt-get install -y apt-transport-https \
&& apt-get update \
&& apt-get autoremove -y --purge \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
&& apt-get clean && rm -rf /var/lib/apt/lists/*

RUN mkdir -p /usr/local/bin/maven \
&& cd /usr/local/bin/maven \
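For reference, the version arguments changed in this hunk (SDK_IMAGE_TAG, SPARK_VERSION, MAVEN_VERSION, HADOOP_VERSION) are ordinary Docker build arguments, so they can also be overridden at build time without editing the Dockerfile. A minimal sketch, assuming the command is run from the repository root and using an illustrative image name/tag:

```bash
# Build the dev image with explicit versions instead of the Dockerfile defaults.
# The image name/tag below is illustrative only.
docker build \
  --build-arg SDK_IMAGE_TAG=3.1-bionic \
  --build-arg SPARK_VERSION=3.0.1 \
  --build-arg MAVEN_VERSION=3.6.3 \
  --build-arg HADOOP_VERSION=2.7 \
  -t dotnet-spark:dev-local \
  docker/images/dev
```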
6 changes: 2 additions & 4 deletions docker/images/dev/README.md
@@ -7,7 +7,6 @@ Using this image, you can compile .NET for Apache Spark yourself.

If you do not want to build those images yourself, you can get our pre-built images directly from docker hub at [https://hub.docker.com/r/3rdman/dotnet-spark](https://hub.docker.com/r/3rdman/dotnet-spark).


## Building

To build a dev image, just run the [build.sh](build.sh) bash script. The default Apache Spark and Maven versions used to build the image are defined in the script.
@@ -27,13 +26,12 @@ build.sh -h

Please note, however, that not all version combinations are supported.
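For convenience, the two most common invocations look roughly like this (a sketch; the exact options are listed by the script's help output):

```bash
# Build an image using the default Apache Spark and Maven versions defined in the script
./build.sh

# Show the available options
./build.sh -h
```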


## Docker Run Example

As mentioned earlier, the dotnet-spark runtime image can be used in multiple ways. Below are some examples that might be useful.

```bash
docker run --name dotnet-spark-dev -d mcr.microsoft.com/dotnet-spark:dev-latest
docker run --name dotnet-spark-dev -d 3rdman/dotnet-spark:dev-latest
```
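If needed, the running container can then be entered for interactive work; a quick sketch, assuming the container name used above:

```bash
# Open an interactive shell inside the running dev container
docker exec -it dotnet-spark-dev bash
```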

## Using the image to build from source
@@ -64,7 +62,7 @@ The image comes with code-server installed, which allows you to run Visual Studio Code
First, start a container from the dev image and map the code-server port to a host port that is reachable via the loopback address only.

```bash
docker run --name dotnet-spark-dev -d -p 127.0.0.1:8888:8080 mcr.microsoft.com/dotnet-spark:dev-latest
docker run --name dotnet-spark-dev -d -p 127.0.0.1:8888:8080 3rdman/dotnet-spark:dev-latest
```
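Because of the 127.0.0.1:8888:8080 mapping above, code-server is only reachable via the loopback address. A quick way to verify it is answering, assuming curl is available on the host:

```bash
# Check that code-server responds on the mapped loopback port
curl -I http://127.0.0.1:8888
```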

![launch](img/dotnet-dev-docker-code-server-launch.gif)
10 changes: 7 additions & 3 deletions docker/images/dev/build.sh
@@ -7,14 +7,18 @@ set -o errexit # abort on nonzero exitstatus
set -o nounset # abort on unbound variable
set -o pipefail # don't hide errors within pipes

readonly image_repository='mcr.microsoft.com'
readonly supported_apache_spark_versions=("2.3.3" "2.3.4" "2.4.0" "2.4.1" "2.4.3" "2.4.4" "2.4.5" "2.4.6")
readonly image_repository='3rdman'
readonly supported_apache_spark_versions=(
"2.3.0" "2.3.1" "2.3.2" "2.3.3" "2.3.4"
"2.4.0" "2.4.1" "2.4.3" "2.4.4" "2.4.5" "2.4.6" "2.4.7"
"3.0.0" "3.0.1"
)
readonly supported_maven_versions=("3.6.3")
readonly hadoop_version=2.7
readonly sdk_image_tag="3.1-bionic"

maven_version=3.6.3
apache_spark_version=2.4.6
apache_spark_version=3.0.1

main() {
# Parse the options and set the related variables
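The supported_apache_spark_versions array extended above is what the script can use to validate a requested version. The actual validation code is not part of this hunk; a hedged sketch of how such a check might look in bash:

```bash
#!/usr/bin/env bash
# Illustrative only: validate a requested Spark version against the
# supported list, mirroring the array defined in build.sh.
supported_apache_spark_versions=(
    "2.3.0" "2.3.1" "2.3.2" "2.3.3" "2.3.4"
    "2.4.0" "2.4.1" "2.4.3" "2.4.4" "2.4.5" "2.4.6" "2.4.7"
    "3.0.0" "3.0.1"
)

is_supported_spark_version() {
    local requested="$1"
    local version
    for version in "${supported_apache_spark_versions[@]}"; do
        if [ "$version" = "$requested" ]; then
            return 0
        fi
    done
    return 1
}

# Use the first script argument, falling back to the current default version.
if ! is_supported_spark_version "${1:-3.0.1}"; then
    echo "Unsupported Apache Spark version" >&2
    exit 1
fi
```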