A Preliminary Exploration of Docker (2): Combining Docker with .NET Core
Posted Jun 16, 2020 • 5 min read
Microsoft provides strong official support for combining the two. You can see this both in the official documentation and in the Docker options built into VS 2017 when creating a .NET Core project. It is closely tied to .NET Core's cross-platform nature: Docker can deploy the application in either a Linux or a Windows environment.
The previous article introduced how to pull the official .NET Core images, but ultimately we want to apply Docker to our own development work: how do we build our project into an image and deploy it to Docker? The first step is to add a Dockerfile. VS 2017 offers roughly three ways to do this; they differ slightly, but each ends up creating a Dockerfile.
1. Add it when creating the project
When creating a new .NET Core project, check the "Enable Docker Support" option. The new project will automatically include a Dockerfile; its contents are analyzed below.
2. Add it manually
For an existing project, right-click the project and choose "Add > Docker Support". This also creates a Dockerfile.
3. Container orchestrator support
This method differs from the previous two. It no longer just adds a Dockerfile: as the name suggests, it sets up an entire pipeline meant to work with continuous-integration tools, covering development, debugging, build, and release in one go. It is added the same way as the second method; you can see the option by right-clicking the project and choosing Add.
In addition to generating a Dockerfile, this method adds an orchestration program called Docker Compose to the solution (it can also be selected when creating a new project, and it is the only orchestrator included by default). It consists of two files:
.dockerignore, which works like .gitignore: files listed in it are excluded when the image is built and published;
docker-compose.yml, which configures the orchestration information, such as the image name and the path to the Dockerfile.
The main Dockerfile instructions are the following; I'll pick a few to talk about.
FROM, MAINTAINER, RUN, CMD, EXPOSE, ENV, ADD, COPY, ENTRYPOINT, VOLUME, USER, WORKDIR, ONBUILD
FROM <image>[:<tag>]
Specifies the base image. If the image is not present locally, the image with that name is pulled automatically; if no tag is specified, latest is used by default.
A FROM instruction is the first line of a Dockerfile, but it need not be the only one: a multi-stage build can use as many as necessary.
Taking the default Core project as an example, the official .NET Core images are pulled here — the runtime and the SDK, both of which also appeared in the previous article.
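As a sketch, this is roughly the multi-stage Dockerfile that the VS 2017 template generates for an ASP.NET Core 2.1 project (the project name CoreDockerDemo1 is taken from later in this article; the exact template contents vary by VS version):

```dockerfile
# Stage 1: runtime-only base image for the final container
FROM microsoft/dotnet:2.1-aspnetcore-runtime AS base
WORKDIR /app
EXPOSE 80

# Stage 2: SDK image used only to restore and publish
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY ["CoreDockerDemo1.csproj", "./"]
RUN dotnet restore "CoreDockerDemo1.csproj"
COPY . .
RUN dotnet publish "CoreDockerDemo1.csproj" -c Release -o /app

# Stage 3: copy the published output onto the small runtime image
FROM base AS final
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "CoreDockerDemo1.dll"]
```

Note that there are three FROM instructions here: the SDK image is only used during the build, and the final image is based on the much smaller runtime image.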
WORKDIR <working directory path>
Sets the working directory inside the container for the instructions that follow it. If the path does not exist, Docker creates it automatically.
COPY <source path> <target path>
Copies files and directories into the container's file system. The source paths are interpreted relative to the build context (the directory that contains the Dockerfile).
RUN <command>
Executes a command on the current image while the image is being built; it can be written in shell form or exec form.
EXPOSE <port>
Declares the port the container listens on. This is only a declaration; the actual mapping to a host port is done with the -p option of docker run.
ENTRYPOINT
Its format is the same as RUN's, but the command is executed when the container starts rather than at build time, and unlike CMD it is not overridden by arguments passed to docker run. There can be only one effective ENTRYPOINT in a Dockerfile; if several appear, only the last one is executed.
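A small sketch of the two forms and the only-the-last-one-wins rule (the project name is the one used later in this article):

```dockerfile
# Exec form (preferred): no shell is involved, arguments are a JSON array
ENTRYPOINT ["dotnet", "CoreDockerDemo1.dll"]

# Shell form: the command is run via /bin/sh -c
# ENTRYPOINT dotnet CoreDockerDemo1.dll

# If a Dockerfile contained several ENTRYPOINT lines,
# only the last one would take effect when the container starts.
```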
I have not used the remaining instructions much, so I won't go into their usage yet.
Docker project debugging
We mentioned three ways to create a Dockerfile above, but they really fall into two cases, and the way the image is packaged differs between them.
Methods that only add a Dockerfile
Use the docker build -t <name> <path> command. This case is the more general one: whether or not the project was created with VS, this command packages the image. Change into the directory containing the Dockerfile and run the command.
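For example (the image name coredockerdemo1 and the directory are assumptions for illustration):

```shell
# Run from the directory that contains the Dockerfile
cd CoreDockerDemo1

# -t tags (names) the resulting image; "." is the build context path
docker build -t coredockerdemo1 .
```

The trailing "." matters: it is the build context that COPY paths are resolved against.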
For demonstration, I first deleted the official .NET Core SDK image. Since the Dockerfile uses a FROM instruction, running the build made Docker automatically download the image and package our project — but the output ended with the message: image operating system "windows" cannot be used on this platform. My Docker was in Linux mode while the image being built was a Windows container image, so it needs to be switched: right-click the tray icon and choose "Switch to Windows containers...".
To show the effect of the Dockerfile instructions, we delete the previously built image and run the build again. This time Docker did not download the Core SDK, because it had already been downloaded. However, I ran into a network problem here and found that the earlier image pull had failed, so I switched to a different registry mirror (accelerator) address, simplified the Dockerfile, and repeated the steps above.
```dockerfile
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY . .
EXPOSE 80
ENTRYPOINT ["dotnet", "CoreDockerDemo1.dll"]
```
You can see that the instructions in the Dockerfile are executed in order. Once it completes, run docker image ls to see the image we built, and then create a container from it as described in the previous article.
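As a sketch of those two steps (the container name demo and the host port 8080 are assumptions for illustration):

```shell
# List local images to confirm the build succeeded
docker image ls

# Create and start a container from the image;
# -d runs it detached, -p maps host port 8080 to container port 80
docker run -d -p 8080:80 --name demo coredockerdemo1
```

The -p mapping is what makes the EXPOSE 80 port from the Dockerfile reachable from the host.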
Container orchestrator support
With this method you don't need to build manually; just click the Debug button in VS. Since we added the orchestrator earlier, we can select Docker as the debug target directly in this project.
Before that, you must configure the docker-compose.yml file. It is largely similar to the Dockerfile, but more intuitive: you fill in the image name and so on.
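A minimal sketch of what such a docker-compose.yml might look like, assuming the project layout and names used earlier in this article (the exact file VS generates may differ):

```yaml
version: '3.4'

services:
  coredockerdemo1:
    image: coredockerdemo1            # name of the image to build
    build:
      context: .                      # build context for docker build
      dockerfile: CoreDockerDemo1/Dockerfile   # path to the Dockerfile
```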
I got the error "Volume sharing is not enabled" on the first build. To fix it, open Docker's settings, select the drive containing the project on the Shared Drives tab, and click the "Apply" button; Docker will then restart automatically.
After that, click Run in VS: the web application is compiled, the image and container are generated automatically, and the website starts. On the first start you may be asked whether to trust the SSL certificate.
With these basics, you can already use Docker to bring some convenience to your development work. To go deeper later, the next step is to combine Docker with continuous integration and apply it to the web-server environment.