Running Local LLMs and VLMs on the Raspberry Pi

Get models like Phi-2, Mistral, and LLaVA running locally on a Raspberry Pi with Ollama.

[Image: Host LLMs and VLMs using Ollama on the Raspberry Pi — Source: Author]

Ever thought of running your own large language models (LLMs) or vision language models (VLMs) on…

MLBasics — Simple Linear Regression

Demystifying Machine Learning Algorithms with the Simplicity of Linear Regression. Continue reading on Towards Data Science »
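For readers who want the mechanics at a glance, here is a minimal closed-form sketch of simple linear regression in plain Python. The data points and variable names are illustrative only and are not taken from the article:

```python
# Simple linear regression via the least-squares closed form:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
# The data below is made up for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.1, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Covariance of x and y divided by variance of x gives the slope.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# Predict with: y_hat = slope * x + intercept
```

The closed form avoids any iterative optimization; for one feature, gradient descent converges to exactly these coefficients.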

Unlocking Data from Graphs: How to Digitise Plots and Figures with WebPlotDigitizer

Unlocking Digital Potential from Static Image Data

Evaluating Large Language Models

How do you know how good your LLM is? A complete guide.

5 Steps to Build Beautiful Stacked Area Charts with Python

How to use the full capabilities of Matplotlib to tell a more compelling story.
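As a quick taste of the technique, here is a minimal stacked area chart using Matplotlib's `stackplot`. The series, labels, and numbers are invented for illustration and are not from the article:

```python
# Minimal stacked area chart with Matplotlib's stackplot.
# All data and labels below are illustrative.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

years = [2019, 2020, 2021, 2022, 2023]
desktop = [30, 28, 25, 22, 20]
mobile = [50, 55, 60, 64, 68]
tablet = [20, 17, 15, 14, 12]

fig, ax = plt.subplots()
# stackplot draws each series on top of the previous one.
ax.stackplot(years, desktop, mobile, tablet,
             labels=["Desktop", "Mobile", "Tablet"])
ax.legend(loc="upper left")
ax.set_xlabel("Year")
ax.set_ylabel("Share of traffic (%)")
fig.savefig("stacked_area.png")

# The stacked series sum to 100% at every x position.
totals = [d + m + t for d, m, t in zip(desktop, mobile, tablet)]
```

Because each band is drawn cumulatively, ordering the series from most to least stable usually makes the chart easier to read.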