Weekly Notes 09/2025

My neck is doing okay. Resting, i.e., sleeping instead of looking at the phone, has helped a great deal. I'm still wearing the cervical collar, as the doctor advised. I've also slowed down on the work front this week because of it. Hopefully, I will be able to catch up next week.

[Photo: Pathu finds the perfect place everywhere]
  • We went out for lunch with friends at Kuuraku, Bangalore. I liked the starters, though I thought they were expensive.
  • I downloaded all my Kindle books. I had stopped buying books on Kindle a long time ago, and now I am more or less completely out of that ecosystem. I am still looking for a decent portable e-ink device, ideally a 6-inch one that runs Android so that I can configure it however I want. Something like the Boox Palma, or maybe I should fix my old Kindle Keyboard and jailbreak it.
  • I got tagged by Sathya and wrote a response: A challenge of blog questions.
  • It's good to see people discussing air quality and AQI because of Bryan Johnson. I don't know why it has taken people so long to understand that air pollution is killing us; some folks have been raising concerns for more than a decade now. Like any environmental disaster, it's going to hurt the middle and lower-middle class.
  • A cartoon can get the website of a century-old news magazine, Vikatan, blocked by the government without notice. We have a long way to go as a democracy.
  • Appa is here; he has a bit of a fever due to travel.
  • I updated the bengaluru_airport_estimated_wait_time scraper to fetch the correct data again; they had updated their backend to include origin validation (a rough sketch of that kind of header fix is below, after this list). I had not looked at the scraper's status for a long time. It failed sometime in 2023, so we lost all of the 2024 data. Anyway, it's up, and data is being collected now. The data format has also changed now that we have T2 and T1 is all domestic. The README and load.sh need an update to handle the new format; I will do it once I have collected some data. Reminder: set up status alerts for all my public scraping jobs.
  • I have been using qwen2.5:0.5b, the smallest qwen2.5 model, locally with Ollama and Open WebUI. It runs pretty well, even on my CPU-only machine. Currently, I am using it to summarize documents, ask questions about them, and so on, and I am very impressed (a quick sketch of how I call it is after this list). I want to try qwen2.5-coder:0.5b next to see what it can do. I am fascinated by small (i.e., less than 1 GB) large language models 😄 I want to push them to their limits and see everything they can do.
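
Not the scraper's actual code, but a minimal sketch (in Python, assuming the requests library) of the kind of fix origin validation usually calls for: sending an Origin/Referer header the backend will accept. The endpoint URL and header values here are placeholders, not the real ones.

```python
# Hypothetical sketch: fetching wait-time data from an endpoint that
# validates the Origin header. The URL and header values are placeholders,
# not the real ones used by the bengaluru_airport_estimated_wait_time scraper.
import requests

API_URL = "https://example.com/api/estimated-wait-times"  # placeholder endpoint

headers = {
    # Backends that validate origin typically expect the site's own domain here.
    "Origin": "https://example.com",
    "Referer": "https://example.com/",
    "User-Agent": "Mozilla/5.0 (compatible; wait-time-scraper)",
}

response = requests.get(API_URL, headers=headers, timeout=30)
response.raise_for_status()

data = response.json()
# The new format reports the terminals separately, e.g. T1 (domestic) and T2.
print(data)
```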
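
And a quick sketch of how I summarize a document with the local model through Ollama's HTTP API (it listens on localhost:11434 by default). The file name and prompt are just illustrative.

```python
# Rough sketch: summarizing a local text file with qwen2.5:0.5b via
# Ollama's HTTP API. The file path and prompt are illustrative.
import requests

with open("notes.txt", "r", encoding="utf-8") as f:
    document = f.read()

payload = {
    "model": "qwen2.5:0.5b",
    "prompt": f"Summarize the following document in a few bullet points:\n\n{document}",
    "stream": False,  # return the whole response as a single JSON object
}

resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=300)
resp.raise_for_status()

print(resp.json()["response"])
```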

You can read this blog via the RSS feed. But if you are someone who loves getting emails, you can join my readers by signing up.
