Configuring forgejo actions

Last week I decided I wanted to try out forgejo actions to build this blog instead of using webhooks, so I read the documentation and started playing with it until I had it working the way I wanted. This post describes how I've installed and configured a forgejo runner, how I've added an oci organization to my instance to build, publish and mirror container images, and how I've added a couple of additional organizations (actions and docker for now) to mirror interesting actions. The changes made to build the site using actions will be documented in a separate post, as I'll be using this entry to test the new setup on the blog project.

Installing the runner

The first thing I did was to install a runner on my server; I decided to use the OCI image installation method, as it seemed to be the easiest and fastest one. The commands I used to set up the runner are the following:

$ cd /srv
$ git clone https://forgejo.mixinet.net/blogops/forgejo-runner.git
$ cd forgejo-runner
$ sh ./bin/setup-runner.sh
...
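The excerpt is cut off before the contents of setup-runner.sh are shown, so purely as a rough illustration of what the script has to do: a minimal sketch of registering and starting a runner by hand, assuming the forgejo-runner binary is installed locally. The instance URL matches the one above, but the token, runner name and labels are placeholders, and the actual post uses the OCI image plus the setup script rather than these manual steps.

# Hypothetical sketch, not the post's setup-runner.sh: register the runner against
# the instance. <registration-token> would come from the instance's admin pages;
# the name and labels below are made-up examples.
$ forgejo-runner register \
    --no-interactive \
    --instance https://forgejo.mixinet.net \
    --token <registration-token> \
    --name blogops-runner \
    --labels docker:docker://node:20-bookworm
# Start the runner in the foreground; in practice a service unit or a container
# would keep it running.
$ forgejo-runner daemon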

March 17, 2025 · 16 min · Sergio Talens-Oliag

Testing DeepSeek with Ollama and Open WebUI

With all the recent buzz about DeepSeek and its capabilities, I've decided to give it a try using Ollama and Open WebUI on my work laptop, which has an NVIDIA GPU:

$ lspci | grep NVIDIA
0000:01:00.0 3D controller: NVIDIA Corporation GA107GLM [RTX A2000 8GB Laptop GPU] (rev a1)

For the installation I initially looked into the approach suggested in this article, but after reviewing it I decided to go for a docker-only approach, as it leaves my system clean and updates are easier.

Step 0: Install docker

I already had it on my machine, so nothing to do here.

Step 1: Install the nvidia-container-toolkit package

As it is needed to use the NVIDIA GPU with docker, I followed the instructions to install the package using apt from the NVIDIA website.

Step 2: Run the Open WebUI container bundled with Ollama

I could install ollama directly on linux or run it on docker, but I found out that there is a container with Open WebUI bundled with Ollama, so I decided to use it instead. To start the container I've executed the following command:

docker run -d \
  -e OLLAMA_HOST="0.0.0.0:11434" -p 127.0.0.1:11434:11434 \
  -p 127.0.0.1:3000:8080 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --gpus=all --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:ollama
...
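The excerpt stops here, so as a hedged follow-up only: a small sketch of pulling a DeepSeek model through the bundled Ollama and trying it from the command line. It assumes the container is running under the name open-webui as started above and that the ollama binary is on the PATH inside the image; the model tag is just an example.

# Pull an example DeepSeek model inside the bundled Ollama and chat with it once.
$ docker exec -it open-webui ollama pull deepseek-r1:7b
$ docker exec -it open-webui ollama run deepseek-r1:7b "Summarise what you are in one sentence."
# With the port mapping above, the same model should also be selectable from the
# Open WebUI interface at http://localhost:3000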

February 3, 2025 · 6 min · Sergio Talens-Oliag