Engineering Journal

A place to note things down.

Using Claude Code in a Kaggle competition

I used Claude Code to compete in a Kaggle competition (from Jun 17, 2025 to Sep 16, 2025). My setup was running python scripts on my local machine. At the time, I found Claude Code produced relatively fewer errors creating and editing python scripts compared to python notebooks. It struggled to correctly edit the python content inside the ipynb file format, which led to more syntax errors and malformed code. I used Opus plan mode, which means Opus 4.1 during planning and Sonnet for the actual coding, saving Opus usage for the more complicated scenarios. ...

October 11, 2025 · 6 min · wal8800

The Design of Everyday Things in Software Development

The Design of Everyday Things is a book about how humans interact with products, offering principles and frameworks that help improve user experience. What has stuck with me the most is how these principles appear not just in the end-user experience of a software product, like a web or mobile app, but throughout broader system design and the entire development life cycle. It is fascinating to see how software systems apply these ideas to improve usability for everyone involved, including developers, operators, and users. Here are some of the principles: ...

May 4, 2025 · 4 min · wal8800

Running an LLM locally

I was looking to run an LLM locally to develop a bot to compete in a 20 questions competition. I found llama-cpp-python (a python binding on top of llama.cpp), which enables us to run LLMs on a variety of hardware. This is useful since I only have an old gaming GPU with little memory (8GB). First, we need to download an LLM model. There is a huge selection of LLMs, fine-tuned and quantised, on the Hugging Face model hub that can run locally. The LocalLLaMa subreddit is also a great resource on the latest LLM models, their performance benchmarks, and actual user experience. ...

October 9, 2024 · 3 min · wal8800

Graph attention network

Paper: https://arxiv.org/pdf/1710.10903.pdf A graph attention network (GAT) is a type of graph neural network with a self-attention layer in the graph updates. The attention layer learns the relative importance of the incoming source nodes for the target node. As a result, when updating the target node’s value during training, the source nodes with higher importance have a bigger impact on the target node’s updated value. The source node is the node the edge points from, and the target node is the node the edge points to. So how does it work? Let’s go through the equations in the paper with code. ...
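The attention-weighted update described above can be sketched in numpy for a single attention head and one target node. This is a minimal illustration, not the post's code; the graph size, feature dimensions, and variable names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Toy graph: node 0 is the target, receiving edges from source nodes 0, 1, 2.
F_IN, F_OUT = 4, 2
h = rng.normal(size=(3, F_IN))         # input node features
W = rng.normal(size=(F_IN, F_OUT))     # shared linear transform
a = rng.normal(size=(2 * F_OUT,))      # attention vector

Wh = h @ W                             # transformed features, shape (3, F_OUT)

# Unnormalised score e_0j = LeakyReLU(a^T [W h_0 || W h_j]) for each source j.
target = 0
e = np.array([leaky_relu(a @ np.concatenate([Wh[target], Wh[j]]))
              for j in range(3)])

# Softmax over the neighbourhood gives attention coefficients alpha_0j:
# the learned relative importance of each source node.
alpha = np.exp(e - e.max())
alpha = alpha / alpha.sum()

# The target's updated value is the attention-weighted sum of the
# transformed source features, so higher-alpha sources dominate.
h0_new = (alpha[:, None] * Wh).sum(axis=0)
```

Sources with larger coefficients contribute more to `h0_new`, which is the "relative importance" the teaser refers to.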

March 10, 2024 · 8 min · wal8800

Running Big Two in the browser

For a while, I have been wanting to create a web app client for the Big Two bot I have trained. However, I didn’t want to rewrite the Big Two game logic from python to javascript. Then I heard about Pyodide, a python distribution that runs in the browser. This means we can run python code in the web browser! So I hacked together a small react app that uses the existing python logic and tensorflow model. With minimal experience in web frontend, I used create-react-app to quickly get started. The app also uses the react pixi.js library to render some simple graphics for the card game. Here are the interesting bits I found while putting together the web app. ...

August 25, 2023 · 9 min · wal8800

Writing plain code

Looking to improve the clarity of my technical writing, I read the “Oxford Guide to Plain English”. The book provides guidelines to help people write in Plain English, using the definition from the Plain Language Association International: “A communication is in plain language if its wording, structure, and design are so clear that the intended audience can easily find what they need, understand what they find, and use that information.” ...

May 27, 2023 · 16 min · wal8800

Ray RLlib and OpenTelemetry

This blog post assumes the reader has a basic understanding of reinforcement learning and the PPO algorithm. Recommended reading: https://spinningup.openai.com/en/latest/algorithms/ppo.html Problem I’m using Ray RLlib to train an agent to play a card game. Training with RLlib’s PPO took longer than my self-implemented PPO training when using the same number of epochs. My hunch was that there were some configuration and implementation differences between RLlib’s PPO training and my own implementation. I needed to accurately measure the time taken as I changed the configuration of RLlib PPO training runs to improve the training time. ...
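The core of the measurement problem can be sketched with the stdlib before reaching for OpenTelemetry: accumulate wall-clock time per named phase of a training iteration, similar in spirit to wrapping each phase in a span. The phase names and sleeps below are placeholders, not the post's actual instrumentation.

```python
import time
from collections import defaultdict
from contextlib import contextmanager

timings = defaultdict(float)

@contextmanager
def span(name):
    # Accumulate wall-clock time for a named section of the training loop.
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name] += time.perf_counter() - start

# Wrap the phases of one training iteration to see where time goes.
with span("sample"):
    time.sleep(0.01)   # stand-in for environment rollouts
with span("train"):
    time.sleep(0.02)   # stand-in for the SGD update

for name, seconds in timings.items():
    print(f"{name}: {seconds:.3f}s")
```

With per-phase numbers like these, a configuration change can be judged by which phase it actually speeds up, rather than by total run time alone.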

October 8, 2022 · 9 min · wal8800

Application Indicator in Ubuntu 20.04

At my previous job, I was using the Trailer app to access and manage my GitHub pull requests. The Trailer app provides an app indicator on top of the menu bar. Clicking the icon shows a list of pull requests along with their status, and the list is updated automatically. This app is only available on macOS. After switching to my new job, I started using Ubuntu 20.04 and couldn’t find an equivalent application. I wanted to see what it takes to create an app indicator application on Ubuntu, and maybe create one myself if it’s easy. ...

June 4, 2022 · 7 min · wal8800

MAC address spoofing in LineageOS 17.1 on Raspberry Pi 3

I wanted to change the MAC address on the Raspberry Pi. After a bit of googling, I was able to change the MAC address using ip link; however, when the device is rebooted, the change is reverted. There isn’t an easy way to change the MAC address permanently, so I decided to create a script that applies the change on start up. Fortunately, the LineageOS Raspberry Pi build already comes with support for running start-up scripts from /system/etc/init.d/. This means I just needed to add a script there. ...
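The ip link sequence mentioned above can be sketched in python. The interface name and MAC address are placeholders, and the post itself uses a shell start-up script rather than python; this just shows the three commands involved (the interface must be down while the address is changed).

```python
import subprocess

def mac_spoof_commands(iface: str, mac: str) -> list:
    """Build the ip link command sequence to set a new MAC address."""
    return [
        ["ip", "link", "set", iface, "down"],
        ["ip", "link", "set", iface, "address", mac],
        ["ip", "link", "set", iface, "up"],
    ]

def apply_mac(iface: str, mac: str) -> None:
    # Requires root. On a stock system the change only lasts until reboot,
    # which is why the post runs it from an init.d start-up script.
    for cmd in mac_spoof_commands(iface, mac):
        subprocess.run(cmd, check=True)

print(mac_spoof_commands("wlan0", "02:00:00:12:34:56"))
```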

February 28, 2022 · 1 min · wal8800