Note: I had stopped writing posts in 2017. I am slowly getting back to it starting in late 2024, mostly to write about AI.

Think about local minima in thousands of dimensions

When I first learned about Gradient Descent about two years ago, I pictured it in the most obvious 3D way - where one imagines two input variables (as the x and y axes of a 2D plane) and the loss as the third (z) axis. In terms of 'local minima', I imagined them as the...
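As a rough illustration of that mental picture (my own toy sketch, not code from the post), here is gradient descent on a two-variable loss, with the loss value playing the role of the z axis:

```python
import numpy as np

# Toy loss surface: two input variables (x, y), loss as the "z" axis.
def loss(w):
    x, y = w
    return (x - 1) ** 2 + (y + 2) ** 2

def grad(w):
    x, y = w
    return np.array([2 * (x - 1), 2 * (y + 2)])

w = np.array([5.0, 5.0])   # starting point on the surface
lr = 0.1                   # learning rate

for step in range(100):
    w = w - lr * grad(w)   # step downhill along the negative gradient

print(w, loss(w))          # ends up near (1, -2), the minimum of this bowl
```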

read more

BeekeeperAI

Website. An interesting application of federated learning to solve healthcare's data-sharing issues. Their platform allows algorithms and data from different entities to interact securely - like an escrow. Based on Azure confidential computing. Founded in 2022 at...

read more

Three simple examples of LLM confabulations

Large Language Models (LLMs) like ChatGPT handle two aspects of communication very well: plausibility and fluency. Given an input context, they determine the most probable sequence of words and string them together in a way that is superbly eloquent. That makes the...
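As a loose sketch of "most probable sequence of words" (a hypothetical toy example with made-up probabilities, not how any real model is implemented), a greedy decoder simply keeps appending whichever next word is most likely, regardless of whether the result is factually grounded:

```python
# Hypothetical next-word distributions; a real LLM would compute these.
next_word_probs = {
    "the cat": {"sat": 0.6, "ran": 0.3, "quantum": 0.1},
    "the cat sat": {"on": 0.7, "under": 0.2, "beside": 0.1},
    "the cat sat on": {"the": 0.8, "a": 0.2},
}

def greedy_continue(prompt, steps=3):
    text = prompt
    for _ in range(steps):
        probs = next_word_probs.get(text)
        if not probs:
            break
        # Plausibility and fluency: always pick the most probable next word,
        # with no notion of whether the continuation is true.
        best = max(probs, key=probs.get)
        text = f"{text} {best}"
    return text

print(greedy_continue("the cat"))  # -> "the cat sat on the"
```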

read more