Software Engineering

intelligent
piXel | it

George A. Rauscher  /  intelligent piXel GmbH  /  Starnberg

Systems that hold.
Infrastructure that stays calm.

There are things George Rauscher does not do. He does not drive across Germany to install software on a computer. He does not repair hardware on-site. He does not show up with a toolkit and bill by the hour for work that should have been done remotely in twenty minutes.

What he does offer, for established clients and for new ones where a serious long-term working relationship makes sense, is remote maintenance across both Mac and Windows. A well-configured Apple environment, properly hardened, is about as close to unbreakable as consumer computing gets. Most problems do not require a reinstallation. They require someone who knows exactly where to look.

Everything else is on the table.

Any API that is properly documented can be integrated. That sentence sounds simple. What it means in practice is that the accounting software talking to the payment processor talking to the dunning system talking to the Deutsche Post API for physical mail delivery, all of it running without anyone touching it, is not a fantasy.

It is a system George built for a veterinary practice because he was tired of chasing invoices. IPMahnwesen handles the complete receivables workflow autonomously: it reconciles incoming payments against Lexoffice, generates reminders and final notices, dispatches them by email or physical post, and hands the file to collections when the threshold is reached. It runs by itself. The payment culture in veterinary medicine is what it is, and the system accounts for it with a logic that has recovered every outstanding amount it was given to pursue.
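The escalation logic at the heart of such a workflow can be sketched in a few lines. This is an illustrative model only, not IPMahnwesen's actual code: the thresholds, field names, and the `next_action` function are assumptions chosen for the example.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Escalation thresholds: illustrative values, not IPMahnwesen's real ones.
REMINDER_AFTER = timedelta(days=14)
FINAL_NOTICE_AFTER = timedelta(days=28)
COLLECTIONS_AFTER = timedelta(days=42)

@dataclass
class Invoice:
    number: str
    amount_eur: float
    due: date
    paid: bool = False

def next_action(inv: Invoice, today: date) -> str:
    """Decide the next dunning step for a single invoice."""
    if inv.paid:
        return "none"
    overdue = today - inv.due
    if overdue >= COLLECTIONS_AFTER:
        return "hand_to_collections"
    if overdue >= FINAL_NOTICE_AFTER:
        return "final_notice"
    if overdue >= REMINDER_AFTER:
        return "reminder"
    return "wait"
```

The real system wraps a loop like this around the Lexoffice reconciliation and the email and postal dispatch channels; the sketch shows only the decision in the middle.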

That is one example. The approach is the same regardless of what the system touches. The work begins with understanding what the client actually needs, not what they asked for, and ends when the result runs the way it should: securely, quickly, and without requiring a phone call every time something changes.

Websites are built fast and built to perform, without WordPress as a default. When a client has an existing WordPress installation, it gets secured, repaired, optimized, and where it makes sense, migrated away from entirely. WordPress plugins get built when needed. What does not get recommended is a bloated CMS as the foundation for something that needs to last and hold up under real conditions. The alternative is leaner, faster, and considerably harder to compromise.

The reality is simple: the majority of website visitors now arrive on a smartphone, while a surprising number of web offerings are still built as if mobile were an afterthought. That is not how intelligent piXel works. Mobile optimization is part of the job from the beginning, because performance, readability, navigation, and conversion have to hold up on the device people are actually using.

Hardware consulting for companies evaluating AI infrastructure is becoming more valuable by the month. Most vendors selling AI-capable hardware are selling to clients who do not actually know what they need. George operates two NVIDIA H200 GPUs locally, so his advice comes from direct operational experience, not from a brochure. He knows the difference between what a spec sheet promises and what a real workload requires, and he can tell a company what to buy, what to skip, and why before they spend money they cannot recover.

IP Beacon V3.5, available at my.0at.de, is a professional network forensics and IP analysis platform. It performs full IP analysis, device identification, privacy scoring, WebRTC leak detection, port scanning, and network testing, with PDF export for every module.

Everything runs client-side. Nothing is stored, logged, or transmitted anywhere. It runs in the browser, runs locally, runs on any server. It is free, open source under the MIT License, and built because tools that actually tell you what is happening on your network should not require an account, a subscription, or a company that quietly logs your data in the background.
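IP Beacon itself runs client-side in the browser; as a rough illustration of what one of its modules, the TCP port scan, does underneath, here is a minimal sketch in Python. The `scan_ports` function and its parameters are this example's invention, not IP Beacon's implementation.

```python
import socket

def scan_ports(host: str, ports: list[int], timeout: float = 0.5) -> dict[int, bool]:
    """Return {port: is_open} using plain TCP connect scans."""
    results = {}
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on a successful connect, an errno otherwise
            results[port] = s.connect_ex((host, port)) == 0
    return results
```

A connect scan is the noisy, unprivileged variant; it needs no raw sockets, which is also why the technique translates to constrained environments like a browser.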

Encryptor.app is military-grade encryption built on AES-256-GCM with PBKDF2 key derivation at one hundred thousand iterations. Version 2.2 introduced client-side file encryption up to two gigabytes, processed in chunks using the Web Crypto API. The plaintext never reaches the server. There is no server-side logging, no account requirement, no backdoor, and no exposure in transit, because nothing leaves the browser unencrypted.

The source code is published on GitHub under the MIT License because encryption tools that ask you to trust them without showing you the code are not encryption tools worth trusting. Privacy is not paranoia. It is infrastructure. Everything deeper on that subject lives on the Security domain, and it is worth reading.

intelligent piXel operates its own DNS servers and maintains connections to all major registrars. Domain registration and management for companies that need their own DNS environment is a service that comes with something most registrar control panels do not offer: a dedicated client account where the company manages its own domains and DNS records directly, with the underlying infrastructure maintained and secured by people who understand what is running under it.

On hosting: dedicated servers are available, but not always the right answer. Cloud infrastructure with nightly snapshots offers a recovery profile that bare metal alone cannot match. And the ability to build genuinely fault-tolerant systems (two servers, two locations, a cluster IP that keeps the service alive when one node goes down) is a level of resilience that most small and mid-sized companies do not realize is within reach at their budget. The recommendation always depends on the actual risk tolerance and the actual workload. It is never a default.

All systems run Linux. Not as a preference but as a standard with no exceptions, because the security profile of a properly hardened Linux server operated by someone who has been breaking into systems for decades is categorically different from the alternative. SentinelLX, close to publication and accompanied by a peer-reviewed scientific paper, takes that further: a server security architecture built on the principle that a system should be capable of detecting, isolating, and responding to intrusion from within, without waiting for an external signal that may arrive too late. The paper documents the methodology in full. It is not a product announcement. It is a contribution to how the problem gets framed.
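SentinelLX's methodology belongs to the forthcoming paper, not to this page. But one classic ingredient of detecting intrusion from within, a file-integrity baseline that a system checks against itself, can be sketched in a few lines; the function names and structure here are this example's assumptions, not SentinelLX's design.

```python
import hashlib
from pathlib import Path

def fingerprint(paths: list[Path]) -> dict[str, str]:
    """Record baseline SHA-256 fingerprints for a set of files."""
    return {str(p): hashlib.sha256(p.read_bytes()).hexdigest() for p in paths}

def detect_changes(baseline: dict[str, str]) -> list[str]:
    """Return the files whose content no longer matches the baseline."""
    changed = []
    for name, digest in baseline.items():
        p = Path(name)
        if not p.exists() or hashlib.sha256(p.read_bytes()).hexdigest() != digest:
            changed.append(name)
    return changed
```

The principle is the same at any scale: the system carries its own reference state and notices divergence without waiting for an external scanner to visit.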

For over twenty years, George has operated his own mail infrastructure. VIPMail is the result of that: a secure email platform running on German servers, maintained continuously, protected in real time by AI-based filtering against phishing, aggressive spam, and malware.

Nothing is read. Nothing is logged. Everything is DSGVO-compliant by architecture, not by policy statement. For companies currently running Microsoft Exchange and wondering why the administrative overhead never stops growing, there is a better path. Proton as a destination, a clean migration, and a mail environment that does what mail is supposed to do without requiring an enterprise support contract to keep it from falling over.

Microsoft Exchange does not get recommended here. The complexity of its administration, the surface area it presents to attackers, and the organizational dependency it creates are not problems that justify the features it provides. When something gets recommended at intelligent piXel, it is because it works, holds up under real conditions, and does not create more problems than it solves.

Everything in the security domain proper (server hardening, penetration testing, incident response, the full depth of that work) lives at security.intelligent-pixel.com. The line between IT and Security is a useful organizational distinction. In practice, the two have never been separate here, and the work in both reflects that.

Search engine optimization has an industry built around it, with retainers, reports, keyword rankings, and the monthly reassurance that the work is working. That industry is not going to disappear overnight. But the foundation it was built on is shifting in ways that most of its practitioners are not yet prepared to acknowledge publicly.

The way people find information is changing. Not gradually and not theoretically. It is changing right now, in the behavior of millions of people who have stopped typing queries into a search bar and started asking their AI system instead. ChatGPT, Claude, Gemini, and the systems that will follow them do not return a list of ten blue links and leave the user to figure out which one is worth clicking. They read the web, synthesize what they find, and produce an answer. The user gets the answer. The website that provided the underlying information may never receive the visit.

Backlinks, which for two decades were the central currency of search engine authority, carry a fraction of the weight they once did. Keyword density, meta descriptions optimized to the character, content written for an algorithm rather than for a human reader: these are the tools of a discipline that was built for a search ecosystem that is being replaced. The SEO providers who built their business models on these mechanics are not wrong about the past. They are selling the past in a market that has already moved.

What matters now is different and in some respects more demanding. AI systems and modern search engines read content for genuine informational depth. They evaluate structure, accuracy, and the degree to which a page actually answers the question a user is likely to bring to it. They assess technical integrity: page speed, mobile performance, crawlability, schema markup that communicates context rather than just content. They consider whether a source is cited by others, not because a link-building campaign manufactured those citations, but because the content was worth citing. Authentic authority has never been harder to fake, and manufactured authority has never been easier to detect.
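One concrete form of schema markup that communicates context is a JSON-LD block embedded in the page. A minimal sketch, built here in Python for consistency with the other examples; the field values are illustrative, not taken from the live site.

```python
import json

# Minimal schema.org JSON-LD for a service page. Values are illustrative.
org = {
    "@context": "https://schema.org",
    "@type": "ProfessionalService",
    "name": "intelligent piXel GmbH",
    "address": {"@type": "PostalAddress", "addressLocality": "Starnberg"},
    "makesOffer": [
        {"@type": "Offer",
         "itemOffered": {"@type": "Service", "name": "API integration"}}
    ],
}

# The block a crawler actually sees in the page <head>:
snippet = f'<script type="application/ld+json">{json.dumps(org)}</script>'
```

The markup does not make thin content authoritative; it makes good content unambiguous to the systems that parse it.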

intelligent piXel builds websites and adapts existing ones for this environment. That means content architecture designed for how AI systems parse and evaluate information. It means technical foundations that perform under real conditions, not just under test conditions. It means the kind of structural overhaul that takes time and requires someone who understands both the technical layer and the content layer and how they interact, because in the current environment they cannot be treated separately.

This is not a small undertaking. The entire logic of how a website communicates with the systems that determine whether it gets found has changed, and most websites that were built or optimized for the previous era have not caught up. The gap between those that have and those that have not is already visible in the results, and it will become more visible every month as AI-driven search consolidates its position.

The honest advice: stop paying for SEO work that is optimizing for a world that no longer exists, and start building for the one that does.

George Rauscher does not take every client. That is not a positioning statement. It is how the work actually functions at the level it needs to function.

Four decades of building systems, integrating APIs, writing code that runs in production environments where failure has real consequences, and advising companies on infrastructure decisions that will define their operational reliability for years: that body of experience is not something that gets distributed across a hundred simultaneous client relationships without losing what makes it valuable. The clients George works with over the long term get something that cannot be packaged into a service tier: someone who knows their systems, understands their business, and treats their infrastructure with the same attention he gives his own.

The chemistry has to be right. A client who wants a problem fixed once and never wants to hear from a technician again is not the wrong kind of client in general. They are simply not the right fit for the way this work gets done. The right fit is a company, of any size, that understands the relationship between the quality of its IT infrastructure and the quality of everything that runs on top of it, and that wants a partner who will still be there in five years, knowing exactly what was built and why, when something needs to change.

IT is not a background function. It is the circulatory system of a business. Communication depends on it. Operations depend on it. Revenue depends on it. A home office with a single person and a laptop has the same fundamental requirement as a company with fifty employees: the systems need to work, reliably, securely, and without requiring a crisis to trigger the attention they should have been receiving all along.

George has clients he has worked with for decades. Not because contracts kept them there. Because the systems hold, the advice has been honest even when it was inconvenient, and the working relationship has been worth more to both sides than a transactional exchange of services for invoices ever could have been.

That is what is on offer here. Not a ticket system. Not a response time guarantee measured in hours. A working partnership with someone who has the depth to address any problem this discipline produces and the standards to tell you what you actually need to hear.

Need IT work?

Serious infrastructure, integrations, migrations, and long-term technical stewardship start at the contact page.

Need deeper security?

Server hardening, penetration testing, and incident response live on the dedicated Security domain.