I build data pipelines and AI workflows that run in production. The panel on the right is fetching live right now.
Hover any node. Click to expand.
// every node is real. every edge is something I've actually shipped.
Live German grid data from ENTSO-E and SMARD, transformed through a Python pipeline and surfaced in Streamlit. The same public data sources that energy-tech companies like 1KOMMA5° build on.
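A minimal sketch of the transform step such a pipeline needs: ENTSO-E and SMARD publish load and generation at 15-minute resolution, which typically gets resampled to hourly before charting. The function and values here are illustrative, not the production code.

```python
import pandas as pd

def to_hourly_mw(quarter_hourly: pd.Series) -> pd.Series:
    # 15-minute MW readings -> hourly mean MW
    return quarter_hourly.resample("1h").mean()

# toy data: two hours of constant 100 MW load in German local time
idx = pd.date_range("2024-01-01", periods=8, freq="15min", tz="Europe/Berlin")
hourly = to_hourly_mw(pd.Series([100.0] * 8, index=idx))
```

Keeping the index timezone-aware (`Europe/Berlin`) matters here: German market data crosses DST boundaries twice a year, and naive timestamps silently mis-bucket those hours.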
Brave Search → Gemini → Google Docs & Sheets. Fourteen nodes: deduplication, validation branching, loop handling, automated document generation. Running in production for a real client.
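The deduplication and validation-branching nodes in that workflow boil down to one pattern, sketched here in plain Python; the field names (`url`, `title`) are placeholders, not the client's schema.

```python
def dedupe_and_branch(results: list[dict]) -> tuple[list[dict], list[dict]]:
    # drop repeat URLs, then branch each record on a validation check
    seen: set[str] = set()
    valid, invalid = [], []
    for r in results:
        if r["url"] in seen:
            continue  # duplicate search result, skip
        seen.add(r["url"])
        (valid if r.get("title") else invalid).append(r)
    return valid, invalid

valid, invalid = dedupe_and_branch([
    {"url": "a", "title": "Report A"},
    {"url": "a", "title": "Report A"},  # duplicate, skipped
    {"url": "b", "title": ""},          # fails validation, branched off
])
```

In the workflow tool the same logic is spread across dedupe and IF nodes; the invalid branch feeds error handling instead of document generation.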
Fireflies captures the meeting. OpenAI structures the transcript. Outputs route to Todoist, Calendar, and Notion automatically. The meeting ends and the work is already organized.
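The routing step reduces to fanning one structured object out to per-tool payloads. This sketch assumes a simplified shape for the structured transcript; the real field names and the Todoist, Calendar, and Notion payloads differ.

```python
from dataclasses import dataclass, field

@dataclass
class MeetingOutput:
    # simplified stand-in for the LLM's structured transcript output
    action_items: list[str] = field(default_factory=list)
    events: list[dict] = field(default_factory=list)
    summary: str = ""

def route(out: MeetingOutput) -> dict:
    # one payload per destination tool
    return {
        "todoist": [{"content": t} for t in out.action_items],
        "calendar": out.events,
        "notion": {"summary": out.summary},
    }

payloads = route(MeetingOutput(action_items=["Send proposal"], summary="Kickoff recap"))
```

Structuring first and routing second is the design choice that matters: each tool integration stays a dumb mapping, and swapping the LLM or adding a destination touches only one side.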
Always-on chief-of-staff agent. Classifies the inbox into 9 labels, auto-drafts tone-matched replies, sends an SMS meeting prep 15 minutes before external calls, records and summarizes every meeting, and delivers a 7am daily brief. Zero manual effort.
I'm a Business Informatics student at TH Wildau and a freelance Automation & AI Developer. I build production-grade data pipelines and AI workflows for clients — the kind of systems where the output ends up in someone's revenue, not someone's slide deck.
I'm GDPR-aware, comfortable with the responsibility of moving real business data, and fast at picking up the tools a team already uses.
I'm looking for a Werkstudent role in data engineering or BI at a Berlin energy-tech or SaaS company — somewhere I can contribute from week one and grow into a full role.
// so is my portfolio.