Weekly Notes 01/2025

Uma met another set of her cousins this weekend. We visited the local temple with her on Saturday to see the elephant; Uma wanted to pet it :) On Monday, we went to the beach to watch the sunset. Uma loves water but hasn't quite figured out the ocean and the waves yet, so she couldn't enjoy the beach thoroughly. She also learned a lot of words this week. Kids learn more from other kids than from adults.

Uma wanted to pet an elephant
  • 💻 Workwise, it has been a slow week. I have taken a few days off to spend time with family before returning to Bengaluru this weekend.
  • 🎦 We watched Rifle Club, and I liked it. It's a home-invasion action movie.
  • ✍ I wrote my year-end post capturing the highlights of 2024, and I created a gallery post of the featured images from the 2024 Weekly Notes. It's a pictorial representation of the year.
  • 🕸 I got to attend Wiki Conference Kerala 2024, as I did last year. It was fun meeting everyone; they are generally kind and collaborative folks. Hopefully, I get to do more than just participate.
  • 🕸 Anand (my friend and a fellow Infyblogger) has been writing weekly notes for a year now. He uses them to gather what he has learned over the week.
  • 🧑‍🏫 I am also working on refreshing MAD1 and MAD2. I hope to make them more interesting than they are now.
  • 🤖 I have been using Alpine.js and HTMX in personal projects. They go well with micro frameworks like Flask or Bottle (a small sketch follows this list). I love small libraries because I can remember them well and debug them easily. I often use an LLM to write code, but knowing the libraries makes my code reviews better, and a good code review is necessary if a project has to stay alive for a decade or more. I also have a simple RAG built with formulaic.app over my favorite stack's documentation and some tech notes. I still use the Zeal offline documentation browser for technical documentation, but an additional search-and-answer layer (RAG) backed by locally running LLMs would make it even more useful.
  • 🤖 I use an old machine (LC230) as my daily driver; my access to real computing power is over a network (local or the internet). I use Open WebUI to chat with LLMs. It gives me one interface to reach LLMs across networks. Since I run the WebUI locally on my machine, I can access my previous chats (history) even when I don't have access to the LLMs (offline). It's working well for now. Of course, you can also build your RAG inside Open WebUI.
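To give a flavour of how the Alpine.js + HTMX + Flask pieces fit together, here is a minimal sketch. The route names and markup are made up for this post, not code from any of my projects: Flask serves a page, HTMX swaps a server-rendered fragment into it without a reload, and Alpine.js keeps a tiny bit of client-side state.

```python
# Minimal sketch: Flask + HTMX + Alpine.js (illustrative only).
from flask import Flask

app = Flask(__name__)

PAGE = """
<!doctype html>
<html>
<head>
  <script src="https://unpkg.com/htmx.org@1.9.12"></script>
  <script defer src="https://cdn.jsdelivr.net/npm/alpinejs@3.x.x/dist/cdn.min.js"></script>
</head>
<body>
  <!-- Alpine.js manages a tiny piece of local UI state (a click counter). -->
  <div x-data="{ clicks: 0 }">
    <!-- HTMX fetches a server-rendered fragment and swaps it into #note. -->
    <button hx-get="/note" hx-target="#note" hx-swap="innerHTML" @click="clicks++">
      Load note (clicked <span x-text="clicks"></span> times)
    </button>
    <div id="note"></div>
  </div>
</body>
</html>
"""

@app.route("/")
def index():
    return PAGE

@app.route("/note")
def note():
    # Return just the fragment; HTMX places it inside #note on the client.
    return "<p>Hello from the server, rendered without a page reload.</p>"

if __name__ == "__main__":
    app.run(debug=True)
```

Small pieces like these are easy to hold in your head, which is exactly why I like the stack.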

You can read this blog using the RSS feed. But if you are someone who prefers getting emails, you can join my readers by signing up.
