Recently on the blog:
02.06.2025
Keeping your infrastructure isolated and carefully splitting hosts/VMs/interfaces into air-gapped
security domains is crucial. It is one of the most effective techniques for securing infrastructure against
data leaks and remote takeovers.
Why is it so important?
- Eliminates entire classes of cyberattacks: attackers can’t hit what they can’t reach.
Ransomware, zero-days, and automated botnets generally require inbound/outbound connections to work.
- No silent data theft – if malware or a malicious insider tries to steal your data,
they can’t call home without Internet access. Data stays inside your walls.
- Simplifies monitoring & forensics - in an offline network, any unexpected connection
attempt is an instant red flag (vs. noisy Internet traffic).
- Future-proofs against unknown threats.
- Real Zero Trust (without the complexity).
Local Nginx caching proxy + Harbor
Isolating your infrastructure from the Internet prevents direct access to external package repositories and
container registries. To maintain updates and deploy containers, you'll need to set up a local caching proxy to
serve these resources internally. While this adds some setup, it ensures your environment remains secure and
fully operational without external connectivity.
Key benefits:
- packages are downloaded only once - once fetched, they're distributed via local
network, dramatically speeding up multi-host updates,
- new host deployment (e.g. from templates) becomes instantaneous since all packages are
already locally available,
- no tooling changes required - everything works exactly the same way (just standard apt update && apt upgrade),
- container images are also cached (Docker/Podman/Kubernetes), using Harbor.
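To give an idea of the setup, here is a minimal sketch of such a caching proxy for Debian/Ubuntu packages; the internal name apt-proxy.lan, the upstream mirror and the cache sizing are assumptions for illustration, not taken from the article:

# placed in the http {} context, e.g. /etc/nginx/conf.d/apt-cache.conf
proxy_cache_path /var/cache/nginx/apt levels=1:2 keys_zone=apt_cache:10m
                 max_size=50g inactive=30d use_temp_path=off;

server {
    listen 80;
    server_name apt-proxy.lan;                  # assumed internal name

    location / {
        proxy_pass        http://deb.debian.org;   # upstream mirror
        proxy_set_header  Host deb.debian.org;
        proxy_cache       apt_cache;
        proxy_cache_valid 200 30d;              # keep fetched packages for 30 days
    }
}

Clients then point their sources.list at the proxy (e.g. deb http://apt-proxy.lan/debian bookworm main) and keep running the usual apt update && apt upgrade; Harbor plays the analogous role for container images.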
It’s not as hard as you think... Let's go! 🧑‍🏭
Read more
31.05.2025
Introduction
Throughout my many years in this business, I have constantly encountered servers connected to the Internet...
even though it is not necessary. Keeping your server offline is one of the most effective ways to avoid serious
security issues such as data leaks and remote takeovers.
What is the most common reason for connecting servers to the Internet? The one I hear most often
is "I have to update it somehow."
There are a few ways to update offline servers conveniently. Today I'll show you how to do it very
easily using ssh. It's not the optimal technique when you manage many servers, but you might
find it useful.
A more efficient method for updating multiple hosts is described in my other article about why
you should keep most of your infrastructure offline.
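One common way to do this over ssh (not necessarily the exact technique from the article) is to reverse-forward a package proxy from your workstation to the offline server; the host name offline-server and the apt-cacher-ng proxy listening on port 3142 are assumptions:

# on your online workstation, which runs a local package proxy on port 3142:
ssh -R 3142:localhost:3142 admin@offline-server

# then, inside that SSH session on the offline server:
sudo apt -o Acquire::http::Proxy="http://localhost:3142" update
sudo apt -o Acquire::http::Proxy="http://localhost:3142" upgrade

The server never needs a default route to the Internet; package traffic flows only through the tunnel, and only for the duration of the session.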
Read more
11.10.2023
It's been a while since the release of the latest processors for the AM4 platform. That is exactly why now
may be the best moment to build a powerful workstation based on it (in terms of performance and
stability vs. price). PCIe 4.0 NVMe drives are cheap and have stood the test of time, not to mention
DDR4 memory, which is widely available and also relatively inexpensive.
It so happens that I have some AM4-based equipment left in the lab - it is a great opportunity to start a
new series about building the ultimate GNU/Linux workstation from scratch.

Have you ever wondered what really affects the speed of your computer? Is it the CPU? Or RAM? Not at all...
The perceived speed of your system depends mainly on the speed of your storage. I bet you intuitively feel
that using a computer with the latest processor but an old HDD would be a very poor experience.
I will try to demonstrate how to push the speed of mass storage to its limits using a regular home
PC. I can assure you that this option is much cheaper than buying a new computer with PCIe 5.0 and,
what's even more important, the result is much better!
The goal of this series is to present an approach that leads to a stable and powerful
workstation capable of enormous storage speeds (~30 GB/s and more).
All this using Free (as in freedom), production-ready, server-grade technology - GNU/Linux.
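As a teaser, sequential throughput of such a setup can be measured with fio; the device name /dev/md0 (a striped NVMe array) and the job parameters below are assumptions for illustration:

# sequential read benchmark of an assumed striped NVMe array (/dev/md0)
fio --name=seqread --filename=/dev/md0 --readonly --direct=1 \
    --rw=read --bs=1M --iodepth=32 --ioengine=libaio \
    --runtime=30 --time_based

A single consumer PCIe 4.0 drive tops out at roughly 7 GB/s, so it is striping several drives together that pushes the aggregate toward the numbers above.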
Read more
02.01.2023
This part of the SSH series will cover the configuration of the OpenSSH client.
Configuration sources
The ssh client obtains configuration data from the following sources, in this order:
- command-line options,
- user's configuration file (~/.ssh/config),
- global, system-wide configuration file (/etc/ssh/ssh_config).
The client will use the first obtained value for each parameter. Configured options are used
not only by ssh itself, but also by scp, sftp, sshfs, git, ansible and any other
tool that uses the OpenSSH library/suite.
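A minimal ~/.ssh/config sketch (the host name, address and user names are made up) illustrating that the first obtained value wins:

Host bastion
    HostName 192.0.2.10      # example address
    User admin
    Port 2222

Host *
    ServerAliveInterval 60
    User jdoe                # ignored for "bastion" - User was already set above

With this file, ssh bastion connects to 192.0.2.10 as admin on port 2222, while any other host falls through to the Host * defaults.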
Read more