Maximilian Schwarzmüller

New posts, thoughts and more...

  1. My Top 5 Ways of Using Generative AI and LLMs

    You can get more done by leveraging LLMs and generative AI. But it's probably not so much about an "AI Agent Army" as it is about using AI for specific tasks. Here are some examples of how I use AI in my daily work.

  2. AI Has A Favorite Tech Stack. That's A Problem!

    Large Language Models (LLMs) are great at generating code, but they often default to a narrow set of technologies. This could stifle innovation and lead to outdated practices.

  3. Why LLMs Need GPUs and VRAM

    Many large language models (LLMs) can be run on your own computer. And whilst you don't need a supercomputer, having a decent GPU and enough VRAM will help a lot. Here's why.

  4. Using Open LLMs On-Demand via Bedrock

    Running open LLMs locally or self-hosting them is great, but it can be a hassle and may require significant resources. Services like Amazon Bedrock let you use (and pay for) open models on-demand, without the overhead of self-hosting (a short sketch of such a call follows below).

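    A minimal, hypothetical sketch of what such an on-demand call can look like with the AWS SDK for Python (boto3) and the Bedrock Converse API; the model ID and region below are placeholders, not taken from the post:

    ```python
    import boto3

    # Hypothetical example: invoke an open model on-demand via Amazon Bedrock.
    # Model ID and region are assumptions; use whichever models are enabled
    # in your own AWS account.
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = client.converse(
        modelId="meta.llama3-8b-instruct-v1:0",
        messages=[
            {"role": "user", "content": [{"text": "In one sentence: what is VRAM?"}]}
        ],
        inferenceConfig={"maxTokens": 200},
    )

    # The Converse API returns the assistant reply under output.message.content
    print(response["output"]["message"]["content"][0]["text"])
    ```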

  5. The Danger of Relying Too Much on AI

    AI coding assistants are powerful tools, and, to some extent, they can replace developers. But they are not a replacement for coding skills. In fact, they make coding skills even more important. The danger is simply that we might forget that.

  6. Mixture of Experts (MoE) vs Dense LLMs

    Mixture of Experts (MoE) LLMs promise faster inference than traditional Dense models. But the model names can be confusing. And a surprise might await when trying to run them locally.
