
When you want to maximize the uptime of your servers, you need a way to update the kernel to fix security vulnerabilities without having to reboot the machines.

So here is how to patch the Linux kernel without a reboot, using Ubuntu Livepatch

kerkour.com/linux-update-kerne
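The flow, as far as I can tell from Canonical's docs, boils down to a few commands (a free token from ubuntu.com/livepatch is assumed; check the current docs before running this):

```shell
# Install the Livepatch client (distributed as a snap)
sudo snap install canonical-livepatch

# Enable it with your personal token from ubuntu.com/livepatch
sudo canonical-livepatch enable <your-token>

# Verify that kernel patches are being applied without reboots
canonical-livepatch status --verbose
```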

Life is too short to manually upgrade the packages of your machine twice a week, so here is how to automate the software updates of an Ubuntu server

kerkour.com/ubuntu-linux-autom
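The usual mechanism for this is the unattended-upgrades package; a minimal sketch of the setup (paths and defaults per the Ubuntu packaging, verify on your release):

```shell
# Install and activate automatic security updates
sudo apt install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades

# The reconfigure step writes roughly this to /etc/apt/apt.conf.d/20auto-upgrades:
#   APT::Periodic::Update-Package-Lists "1";
#   APT::Periodic::Unattended-Upgrade "1";
```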

A few weeks ago, I wrote: "I believe that Rust moves too fast" and "a programming language is a platform".

Some people raised good objections, so I thought it would be good to write a follow-up

👉 kerkour.com/programming-langua

Over the decades, humans have proven to be pretty bad at producing bug-free software.

What if we could have an always available companion that would help us to avoid bugs in our software before they reach production?

👉 kerkour.com/bugs-rust-compiler
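One concrete way that companion (the Rust compiler) helps: there is no null, so "no value" is an explicit Option that you are forced to handle. A small sketch with a hypothetical find_user function:

```rust
// "No value" is explicit: the compiler rejects any code that uses the
// result without handling the None case, so this class of bug never
// reaches production.
fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        _ => None, // omitting this arm would be a compile error, not a runtime crash
    }
}

fn main() {
    // unwrap_or forces us to decide what happens when no user is found.
    let name = find_user(1).unwrap_or("anonymous");
    assert_eq!(name, "alice");
    assert_eq!(find_user(42).unwrap_or("anonymous"), "anonymous");
    println!("found: {name}");
}
```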

Building a crawler in Rust [4/6]

Now that we have a fast concurrent crawler in Rust, it's time to actually parse the HTML and turn it into structured data (remember, this process is called scraping).

📖 kerkour.com/rust-crawler-scrap
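To give a feel for the scraping step, here is a toy, std-only sketch that pulls every h2 title out of an HTML snippet. A real implementation would use a proper HTML parser crate rather than string searching; this only illustrates the unstructured-to-structured idea:

```rust
// Toy scraper: extract the text of every <h2>...</h2> tag.
// String searching is NOT how you should parse real-world HTML;
// it just keeps this example dependency-free.
fn extract_h2(html: &str) -> Vec<String> {
    let mut titles = Vec::new();
    let mut rest = html;
    while let Some(start) = rest.find("<h2>") {
        rest = &rest[start + 4..]; // skip past "<h2>"
        if let Some(end) = rest.find("</h2>") {
            titles.push(rest[..end].trim().to_string());
            rest = &rest[end + 5..]; // skip past "</h2>"
        } else {
            break;
        }
    }
    titles
}

fn main() {
    let html = "<html><h2>First post</h2><p>...</p><h2>Second post</h2></html>";
    assert_eq!(extract_h2(html), vec!["First post", "Second post"]);
    println!("{:?}", extract_h2(html));
}
```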

I recently learned about the existence of the npm ci command, so I wrote a short summary of the differences between npm install and npm ci

kerkour.com/npm-install-vs-npm
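The short version of the difference, in command form:

```shell
# npm install: resolves versions from package.json and may update
# package-lock.json as a side effect.
npm install

# npm ci: clean install, strictly from package-lock.json. It deletes
# node_modules first and fails if the lockfile is missing or out of sync
# with package.json — which is exactly what you want in CI pipelines.
npm ci
```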

Is it time to rewrite everything in Rust?

Rust is seen by many as the least bad language, but it is still far from perfect.

So here is what I think it would take to make it the "perfect" language

📖 kerkour.com/what-a-better-rust

Last week we saw which language features we are going to use to implement our crawler in Rust, so today we start actually implementing it 👨‍💻

kerkour.com/rust-crawler-imple

Hardly a month goes by without some popular dependency being found compromised or backdoored.

Let's see how hackers get write access to software packages in practice 👉 kerkour.com/supply-chain-attac

Scraping is the process of turning unstructured web data into structured data.

Crawling is the process of traversing a lot of interlinked data (web pages, for example).

So let's see how to create a crawler in Rust

kerkour.com/rust-crawler-assoc
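The crawling half can be sketched in a few lines: a queue of URLs to visit and a set of already-visited URLs. Here the link graph is mocked with a HashMap so the example stays self-contained; a real crawler would fetch each page over HTTP and scrape the outgoing links from it:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// Breadth-first crawl over a mocked link graph: pop a URL, record it,
// enqueue its not-yet-visited neighbors, repeat until the queue is empty.
fn crawl(start: &str, links: &HashMap<&str, Vec<&str>>) -> Vec<String> {
    let mut visited: HashSet<String> = HashSet::new();
    let mut queue: VecDeque<String> = VecDeque::new();
    queue.push_back(start.to_string());
    let mut order = Vec::new();

    while let Some(url) = queue.pop_front() {
        if !visited.insert(url.clone()) {
            continue; // already crawled
        }
        order.push(url.clone());
        if let Some(neighbors) = links.get(url.as_str()) {
            for next in neighbors {
                if !visited.contains(*next) {
                    queue.push_back(next.to_string());
                }
            }
        }
    }
    order
}

fn main() {
    let links = HashMap::from([
        ("/", vec!["/a", "/b"]),
        ("/a", vec!["/b"]),
        ("/b", vec!["/"]),
    ]);
    assert_eq!(crawl("/", &links), vec!["/", "/a", "/b"]);
    println!("crawl order: {:?}", crawl("/", &links));
}
```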


Some days I wonder whether life is too short to fight Rust's borrow checker 👉 kerkour.com/life-is-short-rust

What do you think?
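For readers who haven't fought it yet, a tiny illustration of the kind of code the borrow checker rejects, and the small restructure that usually ends the fight:

```rust
// The borrow checker forbids reading and mutating a Vec at the same time.
fn read_then_push(words: &mut Vec<String>) -> String {
    // This version does not compile:
    //   let first = &words[0];             // immutable borrow starts...
    //   words.push("world".to_string());   // ...error[E0502]: cannot borrow
    //   println!("{first}");               //    `*words` as mutable

    let first = words[0].clone(); // finish the read first (here: by cloning)
    words.push("world".to_string()); // ...then mutate freely
    first
}

fn main() {
    let mut words = vec!["hello".to_string()];
    assert_eq!(read_then_push(&mut words), "hello");
    assert_eq!(words, ["hello", "world"]);
}
```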

Concurrency issues are feared by many developers: because of their unpredictable behavior, they are extremely hard to spot and debug.

Here is how Rust prevents concurrency issues in practice: kerkour.com/rust-fearless-conc
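A minimal sketch of what "fearless" means in practice: sharing a counter across threads without Arc and Mutex simply doesn't compile, so the racy version can't exist. With them, the result is deterministic:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `threads` threads that each increment a shared counter
// `increments` times. The type system forces the Arc<Mutex<..>> wrapper;
// an unsynchronized shared counter would be rejected at compile time.
fn parallel_count(threads: usize, increments: usize) -> usize {
    let counter = Arc::new(Mutex::new(0usize));
    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..increments {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();
    for handle in handles {
        handle.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    // Always exactly 8000: no lost updates, no data race.
    assert_eq!(parallel_count(8, 1000), 8000);
    println!("total: {}", parallel_count(8, 1000));
}
```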

From lone wolves to teams of hackers, developers to analysts, the profile of attackers is highly diversified.

Here is a short summary of the different skills needed to carry offensive (cyber) operations: kerkour.com/profiles-cyberatta

I sincerely believe that Rust is a huge step forward in terms of software reliability and performance, which directly translates to $$ and time saved.

But like all technologies, it has drawbacks that may not make it the best choice for your project. Today I want to explore what I think are bad use cases for Rust.

👉 kerkour.com/why-not-rust

Rust is designed by a committee, by choice.

If you have ever managed a project, you can smell the unfocused monstrosity from 100 km away.

And yet, after many years, I've come to the conclusion that in Rust's case, it's a huge asset instead of a liability.

📖 kerkour.com/rust-is-minimalist

Turning a web browser extension into a botnet and exfiltrating sensitive data may be closer to reality than you think.

📖 kerkour.com/hacking-stories/pu

I'll be honest: I never successfully learned Haskell, the holy grail of functional languages.

But a language that nicely mixes imperative and functional programming now exists.

You got it, we are talking about Rust
👉 kerkour.com/rust-functional-pr
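A quick taste of that mix: the same computation in functional style (iterator chain, no mutation) and in plain imperative style, side by side:

```rust
// Functional style: map / filter / fold, no mutable state in sight.
fn sum_of_even_squares(numbers: &[i32]) -> i32 {
    numbers
        .iter()
        .map(|n| n * n)
        .filter(|sq| sq % 2 == 0)
        .fold(0, |acc, sq| acc + sq)
}

fn main() {
    let numbers = [1, 2, 3, 4, 5, 6];

    // Imperative equivalent of the chain above.
    let mut sum = 0;
    for n in &numbers {
        let sq = n * n;
        if sq % 2 == 0 {
            sum += sq;
        }
    }

    // 4 + 16 + 36 = 56, both ways.
    assert_eq!(sum_of_even_squares(&numbers), 56);
    assert_eq!(sum, sum_of_even_squares(&numbers));
}
```

The iterator chain compiles down to roughly the same loop you'd write by hand, so the functional style costs nothing at runtime.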

Job queues are a central piece of any web application, but they come with a high operational cost.

What if, instead of adding another piece, we could use something we already have?

I'm talking about our old friend PostgreSQL

👉 kerkour.com/rust-job-queue-wit
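The core of a PostgreSQL-backed job queue is usually one locking clause. An illustrative sketch (the jobs table and its columns are made up for the example):

```sql
-- Each worker atomically claims one queued job. SKIP LOCKED means
-- concurrent workers never block on rows another worker is processing.
BEGIN;

SELECT id, payload
FROM jobs
WHERE status = 'queued'
ORDER BY created_at
FOR UPDATE SKIP LOCKED
LIMIT 1;

-- ... process the job in application code, then mark it done:
UPDATE jobs SET status = 'done' WHERE id = $1;

COMMIT;
```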
