DEV Community

Linux

What are clouds made of? Linux servers, mostly.

Posts

- Reachy Mini Not Speaking (4 min read)
- I built a tiny Linux tool that shouts “FAHH” when I type the wrong command (2 min read)
- Day 18 — Building a Linux Vulnerability Analyzer 🐧🔍 (6 min read)
- I Built a Chat Server That Cannot Read Your Messages — Here's How (5 min read)
- Linux Permissions (3 min read)
- Every Hacker Should Build This Active Directory Lab (8 min read)
- Linux Filesystem & Essential Linux Commands (3 min read)
- Day 11: Auditing Linux Privilege Escalation Vectors 🕵️‍♂️ (1 min read)
- Harden Linux Services with `systemd-analyze security`: From Score to Enforceable Policy (3 min read)
- Essential Linux Commands - 2 (3 min read)
- Designing a Custom SBC with Integrated Display for Industrial Applications (5 min read)
- Understanding `fork()` in Linux: How Process Creation Really Works (3 min read)
- Understanding Linux Boot Memory Management (5 min read)
- Stop SSH-ing One by One: Building a Parallel Command Executor in Bash (3 min read)
- Local LLM Inference on Windows 11 and AMD GPU using WSL and llama.cpp (3 min read)