My First Lecture at the University of Malta

Snapshot of the title slide captured prior to the lesson.

I delivered the first lecture of the Applied Cryptography course at the University of Malta on Monday evening. As a cyber security professional and academic with a strong commitment to information security, I am genuinely excited to be leading this specialised course this year.

In the introductory lecture, I covered the foundational concepts of cryptology and their relevance to contemporary security applications. Topics included cryptographic mechanisms, classical substitution ciphers and their formal representations, a brief introduction to cryptanalysis, and more.

I am excited to be a part of this journey and look forward to the next lecture in this course on Monday!

Data Security and Privacy in the Era of Floating Homes

Slightly over a year ago, I mentioned Ocean Builders’ innovative living pods and how they use smart home technologies in their vessels. Now a new contender, Reina, takes the stage. Reina’s flagship yacht home model, the luxurious Reina Live L44DR, offers not only lavishness but also enhanced comfort and convenience by incorporating smart home functionalities (smart TV, smart speakers, etc.).

The transition from a fixed abode to a mobile dwelling raises questions. Can a floating home offer a higher degree of security and privacy than its stationary counterpart? Do the connectivity challenges of floating homes resemble those of connected cars and trucks? Beyond concerns about location privacy, these questions warrant further exploration as the appeal of aquatic residences endures. I also touched on this theme at one of the recent conferences where I presented.

EU Data Initiatives: Developments to Watch in 2024 and Beyond

The European Union (EU) has been at the forefront of global efforts to protect privacy and personal data. Over the years, the EU has implemented several initiatives and regulations that aim to safeguard the privacy rights of its citizens. The International Association of Privacy Professionals (IAPP) has created a timeline of key dates for these EU regulations and initiatives, including those that are yet to be finalized.

Photo by freestocks.org on Pexels.com

Here are the key dates to watch in 2024 and beyond:

  • February 17, 2024: The Digital Services Act (DSA), which aims to establish clear rules for online platforms and strengthen online consumer protection, will become applicable
  • Spring 2024: The AI Act is expected to be adopted
  • Mid-2024: The Data Act is expected to enter into force
  • October 18, 2024: The NIS2 directive will become applicable
  • January 17, 2025: The DORA regulation will become applicable

In conclusion, the EU’s data initiatives are set to undergo significant changes in the coming years with the implementation of regulations such as the DSA, AI Act, Data Act, NIS2 directive, and DORA regulation. These initiatives aim to establish clear rules for online platforms, strengthen online consumer protection, facilitate data sharing, and more. It is crucial for organisations and individuals alike to stay up to date with these key dates, both to ensure compliance with the new regulations and to take advantage of the opportunities they present.

For a more detailed overview of the EU’s data initiatives and their key dates, check out the infographic created by the IAPP here.

The Different Types of Privacy-Preserving Schemes

Machine learning (ML) is a subset of artificial intelligence (AI) that provides systems the ability to automatically improve and learn from experience without explicit programming. ML has led to important advancements in a number of academic fields, including robotics, healthcare, natural language processing, and many more. With the ever-growing concerns over data privacy, there has been an increasing interest in privacy-preserving ML. In order to protect the privacy of data while still allowing it to be used for ML, various privacy-preserving schemes have been proposed. Here are some of the main schemes:

Secure multiparty computation (SMC) is a type of privacy-preserving scheme that allows multiple parties to jointly compute a function over their combined inputs while keeping those inputs private. In a common secret-sharing construction, each party splits its input into shares distributed among the other parties; each party then computes locally on the shares it holds, and the partial results are combined to reveal only the final output, never the individual inputs.
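As a minimal sketch of the secret-sharing idea, three parties below compute the sum of their private values using additive secret sharing modulo a prime (the modulus choice is arbitrary, and a real protocol would also need secure channels between parties):

```python
import random

P = 2**61 - 1  # prime modulus; an arbitrary choice for this sketch

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

# Three parties each hold a private input.
inputs = [42, 7, 13]
n = len(inputs)

# Each party splits its input and sends one share to every party.
all_shares = [share(x, n) for x in inputs]

# Party j locally adds the shares it received (column j).
partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]

# Combining the partial sums reveals only the total, not the inputs.
total = sum(partial_sums) % P
print(total)  # 62
```

Any single share (or partial sum) is uniformly random, so no party learns another's input; only the combined result is meaningful.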

Homomorphic encryption (HE) is a type of encryption that allows computations to be performed directly on encrypted data. The encryption preserves the algebraic structure of the data, so that decrypting the result of a computation on ciphertexts yields the same answer as if the computation had been performed on the plaintext. HE can thus protect the privacy of data while still allowing it to be processed.
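A simple illustration of the homomorphic property is textbook RSA, which is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The tiny primes below are purely for demonstration; textbook RSA is insecure and real HE systems (e.g. Paillier, BFV, CKKS) are far more elaborate:

```python
# Textbook RSA with toy parameters (insecure, illustrative only).
p, q = 61, 53
n = p * q              # modulus: 3233
phi = (p - 1) * (q - 1)
e = 17                 # public exponent
d = pow(e, -1, phi)    # private exponent (modular inverse, Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 6, 7
# Multiply the ciphertexts without ever decrypting the operands.
product_cipher = (enc(a) * enc(b)) % n
print(dec(product_cipher))  # 42, i.e. a * b
```

The key point is that whoever multiplies the ciphertexts never sees `a` or `b`; only the holder of the private key can decrypt the result.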

Differential privacy (DP) is a type of privacy preservation that adds noise to query results in order to mask any individual's contribution. The noise is calibrated so that aggregate statistics remain useful while no single record can be singled out. It can be added in a variety of ways, the most common being the Laplace mechanism. DP is useful for preserving privacy because it makes it difficult to infer any individual's information from the dataset.

Gradient masking is a technique that is used to prevent sensitive information from being leaked through the gradients of an ML model – the gradients are the partial derivatives of the loss function with respect to the model parameters. This is done by adding noise to the gradients in order to make them more difficult to interpret. This is useful for privacy preservation because it makes it more difficult to determine the underlying data from the gradients.
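The clip-then-add-noise step described above can be sketched as follows; the clipping norm and noise multiplier are hypothetical hyperparameters, and this mirrors the per-step treatment of gradients in DP-SGD-style training rather than any specific library's API:

```python
import numpy as np

def mask_gradient(grad, clip_norm, noise_multiplier, rng):
    """Clip the gradient's L2 norm, then add Gaussian noise."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))  # bound the norm
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

rng = np.random.default_rng(0)
grad = np.array([3.0, 4.0])  # L2 norm 5.0, will be scaled down to 1.0
masked = mask_gradient(grad, clip_norm=1.0, noise_multiplier=0.1, rng=rng)
print(masked)
```

Clipping bounds any single example's influence on an update, and the noise obscures what remains, at the cost of some training accuracy.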

Secure enclaves (SE) are hardware or software environments designed to resist tampering and interference. They are often used to store or process sensitive data, such as cryptographic keys or model inputs, in a way that is isolated from the rest of the system, even from a compromised operating system.

There are many ways to preserve privacy when working with ML models, each with their own trade-offs. In this article, we summarised five of these methods. All of these methods have strengths and weaknesses, so it is important to choose the right one for the specific application.