2023-01-27

Scattered clouds

 

Image from Pixabay

Just when I had pretty much accepted that the statement “the cloud is someone else's computer” is very boomerish, an important cloud service collapsed on Wednesday: Outlook, Teams and other Microsoft services no longer worked. What is it with that cloud?

In 2016 I gave a presentation entitled The cloud is not a light cloud dessert (‘cloud dessert’ is a straightforward translation of the Dutch ‘wolkentoetje’, which is a fluffy dessert here in the Netherlands). The title slide featured a photo of Captain Kirk from the science fiction series Star Trek, followed by that series' intro. I slightly modified the epic words spoken in the intro in my subtitles:

Cloud: the final frontier

These are the storages of the computing enterprise

Its never-ending mission

To explore strange new servers

To seek out new privacy and new legislations

To boldly go where no byte has gone before.

See, that cloud is someone else's computer, that's just a fact. It simply means that you do not use your own equipment but, depending on the chosen model, your cloud supplier's infrastructure, a development platform or a complete application for end users. As a private person you are mainly familiar with the latter variant; chances are that the photos you take with your phone are stored in the Apple or Google cloud, and not on the phone itself. Your Word and Excel files are no longer on your laptop, but in the Microsoft cloud. LinkedIn, WhatsApp, Twitter, Zoom, Teams, Netflix: all of them are cloud services.

Why do companies use the cloud? Suppose you have a company that receives an enormous number of customers only once or a few times a year, far more than during the rest of the year. Think, for example, of online shops around the holidays, the tax authorities during the period when everyone files a tax return, or a ticket seller for a world star's concert. You have probably experienced such a site telling you: sorry, it's too busy right now, please try again later. That situation is more likely to occur in organizations that manage all their equipment themselves, in their own data center. They have only a limited number of servers and a limited amount of storage and network capacity there. To avoid this, such a company would have to oversize its data center, and a lot of equipment would then sit idle for a large part of the year.

The tempting thing about the cloud is that you purchase services as needed, and that you can scale up and down quickly. The cloud is elastic, as they say. Cloud providers have huge data centers, from which they serve many customers all over the world. Because they are so large, and not all customers peak at the same time, they can distribute their enormous capacity among all those customers. If one customer asks for more, it does not come at the expense of another. In addition to this flexibility, the cloud has another important advantage: you do not have to maintain and secure everything yourself. For many organizations, a cloud supplier can do this much better than they could themselves.
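To make the argument that not everyone peaks at the same time a little more concrete, here is a toy sketch in Python. The numbers are made up purely for illustration and have nothing to do with any real cloud provider; the point is only that the peak of the combined demand is far smaller than the sum of the individual peaks.

```python
# Toy illustration of capacity pooling: customers rarely peak at the same moment,
# so a shared pool needs far less capacity than everyone building for their own peak.
# All numbers are invented for illustration.
import random

random.seed(1)
customers = 100
hours = 24 * 365

def yearly_demand():
    # Modest baseline demand, with an occasional spike (the "holiday rush").
    return [random.randint(1, 10) + (random.randint(50, 100) if random.random() < 0.01 else 0)
            for _ in range(hours)]

demands = [yearly_demand() for _ in range(customers)]

# Capacity needed if every customer sizes its own data center for its own peak:
sum_of_peaks = sum(max(d) for d in demands)
# Capacity a shared cloud needs to cover the busiest hour of the combined demand:
peak_of_sum = max(sum(d[h] for d in demands) for h in range(hours))

print(f"Everyone on their own: {sum_of_peaks} units of capacity")
print(f"Shared cloud:          {peak_of_sum} units of capacity")
```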

But then something like last week happens. Azure, Microsoft's cloud service, had an outage that affected users worldwide. That is quite exceptional, because the major cloud suppliers have built data centers all over the world, which also act as each other's backup. In this case, however, there was a network problem that also affected the links between those data centers. If something like this happens in your own data center, only your own customers are affected. Companies with their own data center are more likely to suffer disruptions than companies that live in the cloud, precisely because they lack the cloud's elasticity and flexibility; but when they do, the number of affected customers is much smaller, because only that company's customers notice. A comparison suggests itself: flying is much safer than driving a car, but if an airplane crashes, there are often many casualties.

In my Star Trek intro I mentioned 'strange new servers'. The word 'strange' has multiple meanings, but 'unknown' in particular applies here: the cloud is a black box into which we put things, hoping that we will get them back out when we need them. If it fails to do so, you are just as powerless as a passenger on a stranded train. It is simply a question of how comfortable you feel with that.

 

Solution

Last week I challenged you to discover which parts of the blog were written by me and which by ChatGPT. You can find the solution here.

 

And in the big bad world…

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

Privacy chat - the solution

 

Image from Pixabay

Last week’s Security (b)log came with a challenge: can you spot which paragraphs were written by me, and which by ChatGPT? Well, the blue paragraphs are artificially intelligent, the black ones are mine.

Two weeks ago I read with fascination the (internally published) story of fellow blogger Guido about ChatGPT, the artificially intelligent chatbot that anyone can use. Until now I knew chatbots mainly from companies that want to prevent you from calling them, and instead invite you to exchange thoughts with their ever-cheerful canned employee via the keyboard. But if I have a question for a company at all, it rarely falls into a category that Clippy's grandchildren can handle; otherwise I would have found the answer myself in the FAQ.

Do you remember Professor Barnabas's computer from the Spike and Suzy (UK) / Willy and Wanda (US) comics? It spat out a card with the answer to any question asked. I wanted to experience for myself how this works in practice. January 28 is European Privacy Day, and I promised the privacy colleagues in my team that I would dedicate a blog to it. As an experiment, I outsourced the writing for once. Not entirely, though: the following paragraphs were written partly by ChatGPT and partly by me (in the ChatGPT pieces I only intervened grammatically here and there). Can you recognize who wrote which paragraph? I will give the solution next week.

As a European, I am proud of the European Union and its role in protecting our personal data. With the introduction of the General Data Protection Regulation (GDPR) in 2018, the EU has introduced one of the strictest privacy laws in the world. The GDPR gives consumers more control over their own data and makes companies responsible for protecting those data.

While the GDPR is an important step forward, the battle for our privacy will never end. Technological developments are going faster than ever and with the rise of artificial intelligence and the internet of things, there are new challenges we face. Businesses and governments must continue to adapt to the changing landscape to protect our data.

Let's zoom in on artificial intelligence as a threat to our privacy. Sometimes you can benefit from a company or government agency not knowing everything about you; in countries with an oppressive regime this applies even more than in our western democracy. As long as only people and traditional information systems are involved, things are sometimes overlooked. But what if artificial intelligence were able to link all kinds of sources together in an unprecedented way and make connections that you, as a human being, or a 'dumb' computer, would never have thought of? Is that still fair? Aren't we entitled to a lucky break from time to time, because 'the system' overlooked something?

It is not only the responsibility of companies and governments to protect our privacy. As consumers, we also have a role to play. We need to be aware of our own data and how we share it. We must ask questions about how companies use our data and we must oppose inappropriate practices. Let's work together to protect our privacy so we can feel safe and protected in our digital world.

Another important step is the use of privacy-enhancing technologies, such as Virtual Private Networks (VPNs) and encryption tools. These can help protect our online communications and protect our personal information from being intercepted by hackers or other malicious actors. It is also important to keep abreast of privacy issues and to advocate for stronger privacy protections on a national and international level. For example, by supporting organizations that are committed to defending privacy rights, and by speaking out against government surveillance and data collection practices that threaten our civil liberties.

In 2009, the scientists Murphy and Woods formulated the Three Laws for Responsible Robotics, following the Three Laws of Robotics by the science fiction author Isaac Asimov. The first of their laws reads: “A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.” Maybe we should also apply this law to artificial intelligence, at least with regard to ethics, and to prevent chatbots from threatening our privacy.

 

2023-01-20

Privacy chat

 

Image from deperfectepodcast.nl

Two weeks ago I read with fascination the (internally published) story of fellow blogger Guido about ChatGPT, the artificially intelligent chatbot that anyone can use. Until now I knew chatbots mainly from companies that want to prevent you from calling them, and instead invite you to exchange thoughts with their ever-cheerful canned employee via the keyboard. But if I have a question for a company at all, it rarely falls into a category that Clippy's grandchildren can handle; otherwise I would have found the answer myself in the FAQ.

Do you remember Professor Barnabas's computer from the Spike and Suzy (UK) / Willy and Wanda (US) comics? It spat out a card with the answer to any question asked. I wanted to experience for myself how this works in practice. January 28 is European Privacy Day, and I promised the privacy colleagues in my team that I would dedicate a blog to it. As an experiment, I outsourced the writing for once. Not entirely, though: the following paragraphs were written partly by ChatGPT and partly by me (in the ChatGPT pieces I only intervened grammatically here and there). Can you recognize who wrote which paragraph? I will give the solution next week.

As a European, I am proud of the European Union and its role in protecting our personal data. With the introduction of the General Data Protection Regulation (GDPR) in 2018, the EU has introduced one of the strictest privacy laws in the world. The GDPR gives consumers more control over their own data and makes companies responsible for protecting those data.

While the GDPR is an important step forward, the battle for our privacy will never end. Technological developments are going faster than ever and with the rise of artificial intelligence and the internet of things, there are new challenges we face. Businesses and governments must continue to adapt to the changing landscape to protect our data.

Let's zoom in on artificial intelligence as a threat to our privacy. Sometimes you can benefit from a company or government agency not knowing everything about you; in countries with an oppressive regime this applies even more than in our western democracy. As long as only people and traditional information systems are involved, things are sometimes overlooked. But what if artificial intelligence were able to link all kinds of sources together in an unprecedented way and make connections that you, as a human being, or a 'dumb' computer, would never have thought of? Is that still fair? Aren't we entitled to a lucky break from time to time, because 'the system' overlooked something?

It is not only the responsibility of companies and governments to protect our privacy. As consumers, we also have a role to play. We need to be aware of our own data and how we share it. We must ask questions about how companies use our data and we must oppose inappropriate practices. Let's work together to protect our privacy so we can feel safe and protected in our digital world.

Another important step is the use of privacy-enhancing technologies, such as Virtual Private Networks (VPNs) and encryption tools. These can help protect our online communications and protect our personal information from being intercepted by hackers or other malicious actors. It is also important to keep abreast of privacy issues and to advocate for stronger privacy protections on a national and international level. For example, by supporting organizations that are committed to defending privacy rights, and by speaking out against government surveillance and data collection practices that threaten our civil liberties.

In 2009, the scientists Murphy and Woods formulated the Three Laws for Responsible Robotics, following the Three Laws of Robotics by the science fiction author Isaac Asimov. The first of their laws reads: “A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.” Maybe we should also apply this law to artificial intelligence, at least with regard to ethics, and to prevent chatbots from threatening our privacy.

 

And in the big bad world…

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 

2023-01-13

Why you can't encrypt external email

 

Image from Pixabay

I stumbled into the new year coughing and sniffling, and paracetamol/acetaminophen, cough drops and tea with honey are still my best friends. Reluctantly I float along on the waves of this epidemic. Now that we have put the corona pandemic behind us, it seems that the flu and cold viruses finally saw their chance to strike the old-fashioned way again. Many of you will now nod in agreement. Sigh – if only human defenses worked as efficiently as the virus scanner on your computer.

But no, I'm not going to talk about malware today. This week's reader mail contained questions in response to the previous Security (b)log, which included a passage that began: “Our internal mail offers the ability to encrypt sensitive messages.” Colleague Monique has since enabled this option by default, but she receives an error message when she tries to send mail to her private address. She would like to understand what is going on.

It may have been a bit mean, but it was precisely with this question in mind that I had included the word “internal” in the quoted sentence. It means that you can encrypt mail that stays within the organization, that is, mail to a colleague. As soon as you send email outside, for example to your private address or to a customer, we are dealing with external email. There are two reasons why you cannot encrypt that mail.

Information, such as email, is encrypted to ensure that unauthorized persons cannot read it. Only those who have the key can decrypt and then read the message. Before we send an email from our organization out into the world, we want to check that it doesn't contain anything nasty, such as viruses (yes, viruses again). But if the email is encrypted, that check is impossible: even a virus scanner cannot read encrypted messages. The same goes for attachments. That is why encryption of external email is not allowed. Incidentally, this works in both directions: incoming encrypted mail is rejected as well. That's a pity, because encryption, just like scanning for viruses, is a security measure. Here one security measure gets in the way of the other.

The second reason why external email encryption doesn't work lies in the error message Monique received: “Can't find this recipient's certificate in the address book.” Nice, but you can only understand this message if you know how encryption works. In the professional literature, this is the moment for Alice and Bob to appear. This couple wants to exchange encrypted email. To do so, some preparations need to be made, because you need keys to encrypt and decrypt. Two per person, to be exact: one public and one secret, which have a mathematical relationship to each other; together they form a key pair. And then it works like this. When Alice wants to send an email to Bob, she encrypts it with Bob's public key. The great thing is that a public key, which is literally available to anyone in the world, cannot be used to decrypt the message. That can only be done with the corresponding secret key, which is held by exactly one person: Bob. If Bob wants to reply, he in turn uses Alice's public key to encrypt his reply. And because only Alice holds the corresponding secret key, only she can read Bob's email.
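For readers who like to see this in action, here is a minimal sketch in Python using the widely used cryptography package; the names alice and bob are of course just illustrations, not part of any real mail program. In practice, mail programs use this asymmetric step to protect a randomly generated message key rather than the message itself, but the principle is the same.

```python
# Minimal sketch of public-key encryption, as Alice and Bob would use it.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Bob generates a key pair: the secret key stays with him,
# the public key may be handed out to the whole world.
bob_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_public_key = bob_private_key.public_key()

# Alice encrypts her message with Bob's PUBLIC key...
message = b"Hi Bob, this stays between us."
ciphertext = bob_public_key.encrypt(
    message,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# ...and only Bob's SECRET (private) key can turn it back into the message.
plaintext = bob_private_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert plaintext == message
```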

Alice and Bob do not have to perform all these actions manually. All they have to do is tick 'encrypt' in their mail program. That mail program does need access to the public keys of the people you want to email with. These public keys are available in the form of so-called digital certificates, which, in addition to the public key, also state that this really is Bob's public key. In Monique's case, that certificate isn't available: her private address probably doesn't have a certificate at all, and even if it did, it wouldn't be included in our internal address book, because we don't allow encryption of external mail anyway. Hence the message about the missing certificate.
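For the curious, this is roughly what a mail client does when it inspects such a certificate. A hedged sketch, again with the cryptography package, assuming a PEM-encoded certificate stored in a file called bob.pem (the file name is purely illustrative):

```python
# Sketch: read a certificate and extract the owner's name and public key.
# Assumes a PEM-encoded certificate in 'bob.pem' (illustrative file name).
from cryptography import x509

with open("bob.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print(cert.subject.rfc4514_string())  # whom the certificate says this key belongs to
print(cert.not_valid_after)           # certificates also expire
bob_public_key = cert.public_key()    # the key a mail client would encrypt with
```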

Monique had an additional question about signing email. She also ticked that option, but, as she writes: “I see no visible added value and therefore do not understand it.” The word “visible” is spot-on. Digital signing uses those same key pairs. Alice uses her own secret key to sign her mail, and Bob uses her public key to verify that the message really came from Alice and that it hasn't been tampered with along the way. The mail program does this for you unnoticed, as long as everything is in order. Only when something is wrong will it sound the alarm.
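Here, too, a minimal sketch of what happens under the hood, with the same assumed cryptography package. Note that it is the mirror image of encrypting: Alice signs with her own secret key, and anyone can check the signature with her public key.

```python
# Sketch: Alice signs with her SECRET key, Bob verifies with her PUBLIC key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

alice_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
alice_public_key = alice_private_key.public_key()

message = b"Hi Bob, it's really me."
signature = alice_private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Bob's mail client verifies silently; it only raises the alarm on tampering.
try:
    alice_public_key.verify(
        signature,
        message,  # change a single byte here and verification fails
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("Valid signature: this really came from Alice, unmodified.")
except InvalidSignature:
    print("Alarm: the message was tampered with or was not signed by Alice.")
```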

So if Monique sends an email home, she cannot encrypt that email. That's not too bad, because of course she doesn't send business information home. Maybe a shopping list or something. I sincerely hope there is no cough medicine on it…

 

And in the big bad world…

This section contains a selection of news articles I came across in the past week. Because the original version of this blog post is aimed at readers in the Netherlands, it contains some links to articles in Dutch. Where no language is indicated, the article is in English.

 
