2024-01-26

Drained weight

 

Image from Unsplash

It's crazy that as a citizen you have to worry about your privacy. In the past, when Roger Moore was still James Bond, you only had to worry about outside interest in your doings if you were a particular kind of company or a government. But nowadays? Everything has a privacy policy. And that means that your privacy is at stake everywhere - otherwise that policy would not be necessary.

Well, the tone has been set for European Privacy Day, January 28. Apparently that day is necessary, too. Witness also this musing of Omri Elisha, professor of anthropology in New York:

We memorized phone numbers.
We memorized driving directions.
No one knew what we looked like.
No one could reach us.
We were god.

In those days, as a child you played outside with your friends, ringing their doorbells unannounced or running into them somewhere in the street. As a boy you wore rubber boots and preferred to play at the local mud puddle. At most you had a watch and a time by which your mother told you to be home (hopefully with some slack factored in to get you to the table clean). Yes, we were those gods, we just didn't realize it.

As a parent I look at this differently. It's quite nice to have your children within digital reach - at least when they respond to you. Are you worried because they are not home yet, or do you want them to run an errand? Sending a quick message usually works wonders. Are they going somewhere? They can text you that they have arrived safely, or share their live location so that you know where they are in case of emergency. It also works the other way around: if help is needed, mom and dad are easily reachable. The price for this comforting technology is privacy. But because the children of this century have never known anything else, they don't miss it.

Nowadays you can hardly buy a device with a power plug that is not subject to a privacy policy. If you do not agree to it, you cannot use the device. Not a soul reads those policies; everyone blindly agrees, if only because they are invariably long, tough reads. You almost wish it just said: All data that this product collects about you and your environment may be used at the sole discretion of the manufacturer and all its business partners. I know of one case where this actually happens. If you travel to the US and come from a friendly country, you do not need a visa. Instead, you can simply apply for an ESTA (Electronic System for Travel Authorization) online. If you start that process, you are greeted by an unmistakable security notification, which begins as follows:

You are about to access a Department of Homeland Security computer system. This computer system and data therein are property of the US Government and provided for official US Government information and use. There is no expectation of privacy when you use this computer system. The use of a password or any other security measure does not establish an expectation of privacy.

It's that simple: don't expect any privacy when using this system. Even security measures that might give the impression of privacy are not there for your privacy. It reminds me of the greeting of the Borg in Star Trek (see this Security (b)log). How different things are, fortunately, with our own government, where people generally do their utmost to safeguard our privacy.

I recently wanted to return a product. The webshop was supposed to send me a DHL shipping label. I received an email from DHL containing not only my own shipping label, but also those of a few other customers. The webshop itself had not received those labels. It's just a small thing, but it does show how easily personal data can leak.

Vegetable jars state the drained weight: how many grams of vegetables are in the jar, not counting the liquid? Perhaps websites should carry a similar notice: given our security level, there is a 5/25/50/75/100% chance that your data will go down the drain.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 

2024-01-19

Stairway to poetry

 

Image from author

The Hague, Ministry of Justice and Security. From the top floor, the 36th, you have a magnificent view of the surrounding area. Even on a meteorologically challenging day like last Monday, with alternating sunshine, snow showers and strong winds. If you have a meeting here, you have to accept losing some time to gazing out of the window. But there is more to experience.

A stairwell in an office building is often boring, because, especially in a high-rise building, hardly anyone goes there. But there, at Turfmarkt, they wrote sayings on the risers of the stairs. The following is written on the stairs between the 35th and 36th floor: Accidents are just around the corner. Happiness is everywhere else.

For people like me, who are professionally concerned with everything that can go wrong, this puts things into perspective, perhaps even more than for 'ordinary' people. We peer around the corner, searching and poking about, while in general we see relatively little serious misery. Yes, there are regular news reports about data breaches, ransomware and DDoS attacks, and criminal phishing campaigns, but most of the time they are not disruptive. Even in Ukraine, which has been suffering from the violence of war for two years now and where the cyber part of the war started much earlier, the digital society is still up and running. It seems unbreakable. So you might think that, in general, we don't go around that corner, but instead go everywhere else.

Whenever something like this comes up, I like to recall the year 2000, or more precisely: the turn of the millennium. That is almost a quarter of a century behind us, which means there is now a working generation that has not experienced that transition. Well, guys, there was a lot of fuss going on, and that fuss had a name: the millennium bug. While you may be reading this blog on your smartphone, which is in fact a pretty powerful computer, it is hard to imagine that computer memory was a scarce resource in the last century. Today a gigabyte is the smallest unit we talk about, but back then it was kilobytes. That makes a difference of six zeros, or a factor of a million. While you can now buy a 64 GB USB thumb drive for less than a tenner, we used to make do with 512 KB floppy disks, which you bought in boxes of ten. The next generation, which could store 1.44 MB (more than twice as much!), felt like a major leap forward. When installing an application on your PC, you were a disk jockey: those products came on a stack of floppies that you had to insert one by one. Downloading had yet to be invented.

Storage was in short supply, and it was skimped on wherever possible. Date fields, for example. Why would you write 1977 if 77 was sufficient? This was common even in the real world: at school I learned the date format 24-5-'65. The apostrophe indicated the century, but you could just as easily leave it out. In computers, that saved two characters per date. But as the turn of the century approached, a problem came into view. Suddenly 31 would no longer necessarily mean 1931; it could also be 2031. Computers would choke on this, for example when sorting data. Heaven and earth were moved to avert disaster. In the Netherlands, an estimated nine billion euros were spent on this, and worldwide three hundred billion dollars, according to Wikipedia.
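To make the sorting problem concrete, here is a minimal sketch of my own (plain Python, obviously not code from that era) showing how two-digit years go wrong once dates cross the year 2000, and the pivot-year trick that was one common remediation:

```python
from datetime import date

# Dates stored the old, space-saving way: two digits for the year.
records = ["31-12-99", "15-06-77", "01-02-31"]  # the last one is meant to be 2031

def parse_naive(d: str) -> date:
    day, month, year = (int(part) for part in d.split("-"))
    # The last-century assumption: every year belongs to the 1900s.
    return date(1900 + year, month, day)

print(sorted(records, key=parse_naive))
# ['01-02-31', '15-06-77', '31-12-99']  -- 2031 is sorted before 1977 and 1999

# A common Y2K fix: a pivot year that decides which century a value belongs to.
def parse_with_pivot(d: str, pivot: int = 50) -> date:
    day, month, year = (int(part) for part in d.split("-"))
    return date((2000 if year < pivot else 1900) + year, month, day)

print(sorted(records, key=parse_with_pivot))
# ['15-06-77', '31-12-99', '01-02-31']  -- the intended order
```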

When the gunpowder fumes from the fireworks had dissipated, it turned out that very little had gone wrong. Then came a wave of criticism: did we spend all that money for nothing? I still get quite worked up about so much naivety. Why do you think things went so well? Because of that great effort, of course! It's as clear as day: there is a problem, you solve it, danger averted. At that level of abstraction it really is that simple.

Back to the stairs of the ministry. That saying is wrong. An accident being just around the corner means that misfortune is very likely to strike. The second sentence on the stairs, on the other hand, suggests that hardly any accidents really happen and that most things go well. The fact that things are going relatively well in the digital society is due to all the measures taken to prevent problems, and to a quick, adequate response when something does happen. The saying on the stairs should therefore read: Accidents are just around the corner. Grab a broom and sweep that corner clean.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 

2024-01-12

Rainbow

 

Image from Pixabay

Recently, there was a newspaper article about armored passenger cars. Or rather: about the 'best-secured passenger car in the world'. Due to all the extras, the colossus weighs around 4,500 kg (9,900 lbs), which means you are not allowed to drive it on a regular passenger car driving license in the Netherlands. Part of the weight is in the windows, which are up to ten centimeters (four inches) thick. But of course, a fair amount of thick steel is also involved. The doors alone weigh 200 kg (440 lbs). Per piece, that is. The car is made in Sindelfingen, Germany, and is called the Mercedes S680 Guard.

But rest assured, this did not suddenly become a car blog after the New Year. No, the trigger for this blog post was a German word in that newspaper article: Beschussamt. Chances are you don't even know how to pronounce it ('be' as in begin, 'schuss' like shoes but shorter, 'amt' with the British a in tomato), let alone what it means. Let's start at the back: an 'Amt' is roughly a service or authority. And 'Beschuss' means shelling. So a literal translation leaves you with something like 'shelling service'. The newspaper found a neater translation: firearms authority.

What does a firearms authority have to do with cars? Well, my own translation wasn't so bad in that respect: they literally shoot at those cars. Because those cars need to be certified, of course, and you obviously won't get that certification just because the brochure states that the vehicle can withstand bullets from a Kalashnikov. At the Beschussamt they want to see that with their own eyes, and moreover, there are formal standards for the protection level of a car. And that is why they empty their weapons into those cars and then investigate what the bullets have done to them.

I can now go in two directions with my blog: I can talk about certification, or about testing. You know what, I'll do the second, just because it's more fun. With those cars, the bullets can come from two sides: from the good guys (the Beschussamt) and from the bad guys (anyone against whom the person being transported in such a car wants to protect themselves). You can look at IT systems in a similar way. Although bullets are not usually literally fired at them, there are two parties that are interested in the resistance that the system offers. On the right side we have the owner of the system, and on the wrong side everyone that owner wants to protect his system against.

But wait a minute; there are more parties on the right side. There is also a whole army of volunteers who look for weaknesses in systems and, if they find any, dutifully report them to the owner without abusing them. They are traditionally called white hat hackers, by analogy with the color of the hats worn by the good guys in old westerns. A more modern term is ethical hacker. Whatever you call them, these people may try to penetrate a system entirely without the owner's knowledge.

A system owner can of course also order his system to be tested. He can have this carried out by his own employees, but an entire industry has also emerged around testing systems: you can simply hire ethical hackers (although it is very pleasant and useful to have a few of them on your payroll). Whoever does it, they perform a so-called pen test. That has nothing to do with stationery, but is short for penetration test – they try to get into your system. You also come across the name A&P test; this stands for attack & penetration – of course a pen test involves an attack.

First you need to decide what their starting point will be: do they get virtually nothing upfront, do they get some more information and an account, or do they get full access plus technical and design information? Like everything in this life, pen tests also come in colors: the first kind of test is called a black box test (the system to be tested is largely a black box for the hacker, who knows nothing and has no access), the second is a gray box pen test, and the third is called white box or – much nicer, though not a color – crystal box. Why would you do that last one? There's no point if the hacker already knows everything and gets free access, is there? Well yes, actually: the system is tested with knowledge that a malicious outsider does not have. That can certainly be useful.
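If you like keeping the three flavors apart in code, here is a small illustration of my own; the exact contents of each starting position vary per engagement, so treat the fields as assumptions rather than an industry standard:

```python
from dataclasses import dataclass

@dataclass
class PenTestScope:
    """What the hired ethical hacker is given up front (illustrative only)."""
    name: str
    some_information: bool   # documentation about the target
    account: bool            # a working account on the target system
    full_access: bool        # full access plus technical and design information

# The three flavors from the paragraph above, expressed as starting positions.
BLACK_BOX = PenTestScope("black box", some_information=False, account=False, full_access=False)
GRAY_BOX = PenTestScope("gray box", some_information=True, account=True, full_access=False)
CRYSTAL_BOX = PenTestScope("crystal box", some_information=True, account=True, full_access=True)
```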

 

There are even more colors in use when conducting exercises. The attackers are on the red team, the defenders on the blue team. And then there is a hybrid called, yes, the purple team; in that configuration, attackers and defenders learn from each other. During such an exercise, the red team can, for example, perform a crystal box pen test, which will hopefully be detected and repelled by the blue team, after which they discuss, as a purple team, what they encountered. You see, the industry has managed to come up with a nice set of terms that are incomprehensible to outsiders. And I haven't even covered all the colors and all the aspects.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

2023-12-15

Noise box for 007

 

Image from author

What do Desmond Llewelyn, John Cleese and Ben Whishaw have in common? Well, they all played the role of Q in James Bond films. You know, that grumpy man who provides Bond with all kinds of technical gadgets, such as shoes with a poisoned blade built in, a lipstick bomb and a watch with a powerful built-in laser. Not that I'm a big 007 fan or a Q groupie, but I recently came across something special that wouldn't look out of place in the arsenal of a double-zero agent.

It looks like a sushi box, someone said. But a heavy one, because the case weighs 600 grams (1.3 lb), partly due to the thick bottom and equally thick lid. The lid closes hermetically and the round thing on the front says automatic pressure purge. Inside you will find a small switch, plus and minus buttons and a few LEDs. No, this is not a sushi box. This is really something from Q's lab.

If you turn on the device - because that is what it is - and close the lid, you will hear noise. The volume buttons give you six different settings; no matter what setting you select, you can hear the white noise through the closed box, and in the loudest setting it is downright annoying. As soon as you open the lid, the noise stops.

Have you figured it out yet? I'll just tell you. This thing is meant to hold your phone during confidential meetings. The noise ensures that anyone who has hacked your phone to secretly listen in will hear nothing but noise. And through the transparent lid you can see when something arrives on your phone, for example a message or a call. That is the advantage of this box over a Faraday cage, which blocks all electromagnetic radiation and effectively creates an airplane-mode environment - although even then it cannot be ruled out that malware records the conversation and sends it later. In short, with this box you are reachable and impossible to eavesdrop on at the same time. Wow.

I can totally picture it. James Bond in the office of M, who is about to reveal the next assignment. But first the phones go into a box like this one. Because that meeting is, of course, top secret. And officials like those two are by definition a target for the kind of hackers who have the knowledge and resources to plant eavesdropping software (I'm thinking of our beloved state actors). And of course our film heroes must be reachable at all times, because their American colleague Felix Leiter may call with important news.

In real life, too, the market for this product will mainly be the world of spies. Top industrialists and other people who know things that others would love to know will also be among the customers of the Dutch company that developed it. You are less likely to encounter it in an online store with nice gift ideas for Christmas, if only because it seems to be quite pricey.

In less exciting ecosystems, people use a poor man's version of this high-tech device: a preserving jar. You know, one of those glass jars with a rubber ring and a snap closure, intended for preserving fruit and vegetables. Well, you can also put your phone in such a hermetically sealed jar, while it remains visible (but remove the food first and clean the jar thoroughly, please). Lacking a preserving jar myself, I cannot test how soundproof this contraption is, but I am willing to believe that using it is not pointless. If only because awareness of confidentiality gets a boost when preserving jars suddenly appear on the conference table.

Happy holidays from Borsoi, Patrick Borsoi.

The Security (b)log will return next year.

 
And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 

2023-12-08

USB condoms

 

Image from Pixabay

Customs wondered whether they could charge their mobile equipment at public charging points. That question came to our team, and when we talked about it, all eyes turned expectantly to me: something for a blog?

Of course, we could easily have answered, "No, don't do that!" (And I will definitely do that later on.) But it is of course much better to explain the ins and outs of the matter, and what alternatives there are. And it would also be a shame to serve only my colleagues in the once green uniform, when this is important for everyone - at work and at home.

This concerns charging via a USB cable. Something you probably do every day with your phone or tablet - even if you have an iThing from Apple, because although slightly older iPhones and iPads have a Lightning connector instead of USB, what plug is on the other end of the cable, on the charger side? That's right, USB-A! And what's so dangerous about USB? Well, it can do more than just charge: you can also send data through it. Perhaps your printer is connected to your PC via a USB cable, or your laptop is connected to an external screen with such a cable. That alone proves that data can pass through a USB connection. So what? Ah, now we're touching on my area of expertise. If data can flow somewhere, it can do so without you noticing. And that can have consequences for the confidentiality of your data, or that of your employer. Data can of course be anything: photos, contacts, texts, spreadsheets, you name it. All things digital.

Criminals know that too. Under the motto 'data is the new oil' (in other words: you can make a lot of money with it) they like to explore new paths. And what does that have to do with those public charging points that Customs asked about? Well, such a public charging point is a USB socket on the train, on the bus, in a hotel room, you name it; you see them everywhere these days. Or it's a USB cable dangling somewhere. Sometimes you'll run into those small lockers for charging your phone (I even saw them once at a security conference...). The problem with all those generous electricity suppliers is that you don't know what – and who – is behind them. And here's the thing: something can be added to those sockets and cables, or something can be plugged in that is more than just a charger. There are even cables available with plugs that transmit information to their owner via WiFi. All of this outlines the risk scenario at stake here: someone steals data from your device via a seemingly innocent, free charging option. This phenomenon even has a name: juice jacking. Your data is kidnapped via the power cable.

However, in more than nine out of ten cases, such a public charging point will not be a problem at all. I don't see a hacker easily taking a train apart, hiding something in a USB port and then hoping that one day someone with important information will connect their phone to exactly that charging point. A power cord dangling from a well-intentioned pole in the city, or one of those charging lockers, is a different story, because they are much easier to manipulate. The majority of victims of these attacks, however, are targeted because they possess specific information. When I talk about targeted attacks to my primary audience, I always mention two organizations: Customs and the FIOD (the Dutch Fiscal Information and Investigation Service). Both hold information that is interesting to criminals, and Customs officers in particular sometimes appear abroad, which makes things a little more exciting.

What can you do to avoid using public charging points? Leave home with a full battery, and if you know that won't be enough, be prepared: bring your own charger and cord. Going somewhere without an electrical outlet? Then put a power bank in your bag, preferably a slightly more expensive one that charges your device quickly. If you really cannot avoid a public charging point, use a USB condom (or a juice-jack defender, if you don't like the former term). That's a small adapter: one end goes into your device, and the public charging cable goes into the other. USB condoms only let electricity pass through, not data. Never use a charging cord or charger that you found somewhere; it may not have ended up there by accident. And if your device lets you choose between data transfer and 'charging only', choose the latter.

Well, as predicted above, it comes down to this: just don't use public charging points. Nowhere, never, even if you are 'not important'. If you apply that principle, you will never have to think about it again.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

2023-12-04

Quantum pathfinder

 

Photo Petra Wevers

The Dutch word ‘kwantum’ translates easily into the English quantum, meaning quantity, although I mainly think of large quantities. That is probably due to the term quantity discount: buy a lot of something and it becomes cheaper. There is also something orange in my mind's eye, and that is due to that Dutch home furnishings store chain with its orange logo, which once started under the name Kwantum Hallen (‘Quantum Halls’).

For some time now, the word has been buzzing around the international IT community in its English spelling. It's all about the quantum computer, that strange machine that seems to come straight from the film set of Back to the Future, with its system of elegant pipes that provide cooling. Because the quantum computer likes it cold: in the heart of the machine the temperature is only ten millikelvin (a tiny bit colder, 0 K or roughly -273 °C, is absolute zero: it can't get any colder). 'Quantum' in this context does not stand for a lot; rather, it revolves around the smallest possible quantities.

In addition to its bizarre appearance and the conditions required for it to function, the quantum computer has another peculiar property. For as long as computers have existed, we have been used to the bit: a value that can be 0 or 1 and with which the computer can do calculations. But that crazy quantum computer works with qubits, which can be 0 and 1 at the same time, and everything in between. Until you look at it, because then the qubit has to show its colour. Sort of like Schrödinger's cat, which is in a closed box and is therefore simultaneously dead and alive to an observer, until the moment he opens the box and determines the state of the animal. With those qubits you can perform some calculations very quickly, because you can follow multiple paths at the same time. While ordinary computers work according to the pattern 'if this is true, then do this, else do that', the quantum computer simply does both and ultimately sees where it ends up. As a result, it makes many mistakes, but because it performs the calculations very often, a winning outcome emerges.
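The 'both at once until you look' idea can be illustrated with a toy sketch of my own in plain Python - no quantum hardware involved, just the probabilities behind a single qubit:

```python
import random
from math import sqrt

# A qubit state is a pair of amplitudes (a, b) with a^2 + b^2 = 1 (kept real-valued
# here for simplicity). a^2 is the chance of reading 0, b^2 the chance of reading 1.
state = (1 / sqrt(2), 1 / sqrt(2))   # "0 and 1 at the same time", 50/50

def measure(state):
    """Looking at the qubit forces it to show its colour: the outcome is 0 or 1."""
    a, _b = state
    return 0 if random.random() < a * a else 1

# Repeat the experiment often enough and the underlying probabilities emerge.
results = [measure(state) for _ in range(10_000)]
print(results.count(0), results.count(1))   # roughly 5000 / 5000
```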

I talked about this with our brand new team member Petra Wevers, who calls herself a pathfinder in the field of quantum security. Quantum computers threaten the current way we protect our data, which relies to a very large extent on a complex mathematical problem. To encrypt data you need keys, which are created by multiplying very large prime numbers. An attacker who wants to obtain the key does have the outcome of that multiplication, but finding the two prime numbers (factorization) is extremely difficult. At least, for regular computers. For a sufficiently powerful quantum computer, however, it is a piece of cake. The quantum computer therefore poses a major threat to the confidentiality of our data.
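A toy example of my own, with comically small numbers (nothing like real RSA parameters), shows the asymmetry: multiplying is instant, while recovering the primes by brute force becomes hopeless as the numbers grow - unless you have a large quantum computer running Shor's algorithm.

```python
from math import isqrt

# Multiplying two primes is trivial; the product becomes part of the public key.
p, q = 10_007, 10_009
n = p * q                      # 100160063

# Recovering p and q from n by trial division means testing candidate divisors
# up to sqrt(n). Fine for this toy number, utterly hopeless for a 2048-bit modulus.
def factor(n: int):
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None

print(factor(n))               # (10007, 10009)
```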

Current quantum computers cannot do that yet. Predictions vary widely, but you often hear that it will take somewhere between 7 and 10 years. Elsewhere I learned that from 2030 onwards there is a real but small chance of breaking cryptography. Breaking RSA 2048 (a widely used cryptographic algorithm with a key length of 2048 bits) is expected to require a quantum computer with a million qubits, while the most powerful known (!) machine has only 433. Oh, you think, so we're not in a hurry. Think again. A lot of information that is confidential now will still be confidential in ten years. Long-term attackers, such as certain countries, are already stealing that information, even though they can't do anything with it yet. But if they can read it a decade later, it will still be useful to them. Steal now, decrypt later, is their motto. Petra calls the situation we are in now the quantum squeeze. Others talk about Q-day or even the Quantum Apocalypse, but it all comes down to the same thing: we have to do something before it's too late. And we have to act now.

We do not yet have quantum-safe cryptography, and the route to it has not yet crystallized, says Petra. There are stopgap measures. Making keys longer, for example, so that even a quantum computer will take a while to figure them out. And – allow me to get specific for a moment – switching to TLS 1.3, because previous versions, which are still in widespread use, cannot handle hybrid algorithms (a combination of different algorithms). In addition, we can be quantum-annoying by changing keys frequently, so that quantum computers choke on the sheer workload. And if you as an organization purchase anything, include quantum safety in your requirements. Ask your suppliers about their plans in this area.
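For the TLS 1.3 point, here is a minimal client-side sketch, assuming Python's standard ssl module; example.org is just a placeholder host for illustration:

```python
import socket
import ssl

# Create a default client context and refuse anything older than TLS 1.3.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3

# example.org is only a placeholder for this sketch.
with socket.create_connection(("example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.org") as tls:
        print(tls.version())   # 'TLSv1.3' if the handshake succeeded
```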

Governments and the scientific community are taking our safety seriously, says Petra. Take, for example, the Dutch Quantum secure Cryptography Gov program. Next year, NIST (the US National Institute of Standards and Technology) will publish standards in this area, which are expected to be incorporated into commercial products about three years later. According to Petra, it is generally overlooked that soon everyone will be able to use quantum computers via a website, including criminals - just as we can now all use artificial intelligence. It is not all doom and gloom: quantum computers are also expected to help in the development of new medicines and batteries, for example. Let's fight to ensure that the positive use of this groundbreaking technology wins.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 

2023-11-17

Kafka's Castle

 

Image from Pixabay

Remember that castle I wrote about last week? Where they didn't trust anyone, because they assumed the enemy was already within the walls of the castle? I went for a walk around the area, and guess what? There is another castle just down the road. And they do things in a completely different way there.

Not so long ago I heard this statement at a conference: we must move from 'low trust, high tolerance' to 'high trust, low tolerance'. That’s one of those statements to which the audience mumbles in agreement, without yet understanding exactly what it means. I make a note of those kinds of statements to think about them later. Writing a blog is an excellent way to hatch an egg like that one. Buckle up, dear reader, because at this point I don't know yet where the story is going.

The statement contains the assumption that many organizations work from a kind of non-trust (which is different from distrust), much like in last week's castle. There are many rules you have to adhere to, because left to yourself you probably won't do the right thing - not because you don't want to, but because you cannot know all the rules. And because there are so many of them, it is very difficult to adhere to them all: you don't know them all, some are not feasible, and sometimes they are simply inconvenient. You know, that word 'actually'. Whenever someone says that something shouldn't actually be done that way, you already know that a rule is about to be worked around. The lord of the castle knows this too, and therefore turns a blind eye to many things: he is very tolerant, as long as the rules are not broken deliberately and with malicious intent.

The statement from the second paragraph implies that that attitude is not good, because, well, we 'must' move towards that other model: high trust, low tolerance. This lord of the castle assumes that everyone who works for him understands very well what is and is not acceptable, because many things are obvious. When you enter somewhere, you close the door behind you - not only because it would otherwise be draughty, but also because someone might slip in who shouldn't be there. If you're in charge of the lady's jewelry, you understand that you are not supposed to lend it to your girlfriend for an evening. So there are far fewer formal rules, but woe betide you if you betray that trust and they find out. Then you'll be in the dungeon on bread and water in no time. There is little tolerance.

Do you know Franz Kafka's novel Der Prozess (The Trial)? That story revolves around Josef K., who is arrested and ultimately convicted without ever knowing why. Apparently he sinned against rules that he did not know – even could not have known. We could easily end up in such a Kafkaesque situation if we work on the basis of 'high trust, low tolerance'. Not a nice place to live, that castle.

What about a middle ground? I call it 'some trust, some tolerance'. It is probably true that we have too many rules, which no one knows anyway. Every citizen is supposed to know the law, they say. But how realistic is that, taken literally? Even without knowledge of the law, you know that you are not allowed to puncture car tires, right? Likewise, there are numerous security rules that people adhere to anyway - or where a little more tolerance wouldn't hurt. It annoys me every time that the app in which I can see my daughter's class schedule logs me out if I haven't opened it for a few weeks. Then I have to log in again, and I always have to figure out how that works, because it works differently from everywhere else. How sensitive is what's in that app, really? Let it piggyback on the security of my phone. Even my bank's app is easier (after an initially strict admission procedure).

So we can probably get by with fewer rules, but we also have to learn to be less tolerant. Too often, someone still does something in a way they know perfectly well is not how it should be done, but - with the best intentions, no doubt - they do it that way anyway. It works, but it carries too many risks that may have been overlooked. Tolerance should not be taken; it should be given, by the person who is responsible.

So we need a new castle, at an appropriate distance from Kafka's. With residents who reasonably adhere to rules that mainly regulate what is not obvious to everyone. That model will only work for people, by the way. Let’s stick to zero trust for systems.

Next week there will be no Security (b)log.

 

And in the big bad world...

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 
