Diary

16th December, 2024

LLM context windows

  • An LLM's context window is the amount of text the model can take as input when formulating a response. Prompt engineering is the practice of tweaking the text inside this context window.
  • The longer the input in the context window, the higher the compute cost.
  • Neural attention memory models (NAMMs) can be used to eliminate redundant tokens and improve performance.
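The cost point above follows from the window being a hard budget: if the prompt exceeds it, something must be dropped. Below is a minimal sketch of context-window truncation; it assumes whitespace splitting as a stand-in for a real subword tokenizer, and `fit_to_context` is a hypothetical helper name, not a real library API.

```python
# Minimal sketch of context-window truncation. Real LLMs use subword
# tokenizers; splitting on whitespace here is a simplifying assumption.

def fit_to_context(text: str, max_tokens: int) -> str:
    """Keep only the most recent `max_tokens` tokens of the prompt."""
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text
    # Drop the oldest tokens first, keeping the most recent context.
    return " ".join(tokens[-max_tokens:])

prompt = "one two three four five six"
print(fit_to_context(prompt, 4))  # → "three four five six"
```

Dropping the oldest tokens is only one policy; the NAMM idea in the note above is instead to score tokens and evict the redundant ones, wherever they sit in the window.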

27th October, 2025

Big Oh notation
  • Big Oh notation isn’t measuring how fast your code runs at any given moment in time. It estimates how the running time grows as the data grows.
  • For small enough data sizes, it isn’t worth thinking too much about algorithmic complexity. Just proceed with using the simplest approach.
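The growth-not-speed distinction above can be made concrete by counting steps rather than timing anything. A small sketch, assuming sorted data for the binary search; the `steps` counters are hypothetical instrumentation added for illustration:

```python
def linear_search(items, target):
    """O(n): steps grow in proportion to the input size."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search(items, target):
    """O(log n) on sorted input: steps grow slowly as data grows."""
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

# Searching for the last element: linear steps scale with n,
# binary steps barely move.
for n in (1_000, 1_000_000):
    data = list(range(n))
    print(n, linear_search(data, n - 1), binary_search(data, n - 1))
```

At n = 1,000 the linear scan takes 1,000 steps and the binary search about 10; at a million elements the gap is 1,000,000 versus about 20, which is exactly the "as data grows" framing in the note, and also why the simple approach is fine when n stays small.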

29th October, 2025

Progressive Nations
  • Venezuela vs Norway - Venezuela squandered its oil-generated wealth through unchecked public spending and corruption, while Norway built a sovereign wealth fund.
  • Colonial legacies create uneven playing fields through extractive institutions, arbitrary borders, ethnic fragmentation, and weak governance structures: over-centralization, few checks and balances, no free press, and suppressed opposition.
  • Colonial legacies are constraints, not destinies. Countries like India, Singapore, and Botswana managed to build strong economies.
  • A prosperous nation requires institutions with high public trust.