A Microelectronic “Canary in a Coal Mine”
A Call to a New Approach for National Security

Introduction

The United States no longer has the manufacturing capability or the access to materials needed for continued economic growth and prosperity. As a result, the nation is entering a period of heightened national security risk driven by a lack of assured access to specific goods and products. One industrial sector, microelectronics, is emblematic of the issue; a similar argument could be made for other sectors, such as pharmaceuticals and certain raw minerals. The technologies that underpin microelectronics, including transistors, computers, digital programming, and others,1 were transformative technologies that the United States dominated throughout the 20th century. The United States both developed and manufactured the products that sprang from them and led the world in microelectronics design and manufacture. As the global economy became more entrenched in the 21st century, however, manufacturing and access moved from the United States to other nations. The result is that both economic and national security are vulnerable to supply chains that extend to global competitors. Using the semiconductor industry as an example of how supply chains create vulnerabilities, we call for a new approach to national security: ensuring that critical industries can provide assured access.

Read more: A Microelectronic “Canary in a Coal Mine”

Securing Critical Supply Chains

Strategies for Sovereignty Over Critical Supplies

During times of crisis, such as the COVID-19 pandemic, the significance of securing critical supply chains to uphold national security becomes evident. How can the United States maintain sovereignty and protect its interests when our economy and national security are dependent on external, global supplies of services and products?

We discuss three strategies that the United States can adopt to maintain full sovereignty over critical supply chains:

  1. Fully US Controlled Critical Supply Chain
  2. MAD-1: Mutually Assured Dependence
  3. MAD-2: Mutually Assured Destruction

Read more: Securing Critical Supply Chains

Authentication Using Biometrics: How to Prove Who You Are

It is increasingly important to be able to prove that you are who you say you are. Logging into a computer, operating an ATM, voting, and making purchases on credit all require authentication. The field of biometrics studies anatomical, physiological, and behavioral attributes of humans that can be used to distinguish one person from another. Historically, modalities such as fingerprints have been used to uniquely identify a person. Biometric measures can authenticate a person in place of less secure methods such as badges or passwords, and thus have much appeal for practical application. As a result, the academic field of biometrics continues to spawn commercial endeavors. This paper surveys some of the promising biometric measures and considers prospects for employing DNA-based authentication methods in the future.

Introduction

It is hard to prove that you are who you say you are. You have a name, and so you can tell people your name. But someone else could impersonate you simply by using the same name. So what do you do when you have to prove that you are the person you say you are?

Of course, we must prove it all the time. We sign documents, we provide passwords to log in, and we present photo IDs. Sometimes we are required to provide our Social Security number and date of birth, as though only we would know that information. Notaries check our government-issued picture IDs, as do the TSA officials at the airport. Increasingly, voting locales require some form of identification. Physical possession of a smartphone also acts as a personal identifier; we can now make purchases based simply on possessing our personal cell phone.

None of these methods of authentication is foolproof. For example, signatures morph over time and can be forged, databases of identification numbers are stolen, passwords are hacked, and cell phones are stolen and unlocked. A determined impersonator can defeat any of these authentication approaches.

Identity fraud and identity theft are increasingly serious problems, costing tens of billions of dollars per year in the US alone. All interactions with government and financial institutions, and most interactions with businesses, involve authentication as proof of identity. Authentication is fundamental to the workings of a civilized society; the election security debate, for instance, is mostly about trust in authentication. Technology, however, can provide solutions.

One unacceptable solution would be to implant a chip in every human at birth. In lieu of that distasteful option, society is increasingly turning to technology, employing biometrics to authenticate a person. Biometrics are distinctive physiological and behavioral attributes that can be used to identify individuals. These characteristics are individualized, relatively fixed, and recordable; typically, they are also hard to forge. In what follows, we discuss the emerging possibilities for automated biometric authentication.

Ultimately, the most distinctive and immutable property of each individual is their DNA sequence. (Identical twins, of course, have the same DNA sequence, but other markers can distinguish them.) By identifying an appropriate number of specific markers that vary across the population but, taken together, uniquely identify a particular individual, it should be possible to authenticate a person biochemically. With advances in biotechnology, we foresee a time when signatures can be replaced with fast and efficient biochemical tests.
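To make “an appropriate number of markers” concrete, consider a rough calculation. If the genotype observed at each marker is shared by about one person in ten, and the markers are statistically independent, the probability that an unrelated person matches on every marker shrinks geometrically with the number of markers. The short Python sketch below illustrates the arithmetic; the marker count and the 10% frequency are illustrative assumptions, not real population statistics.

    # Illustrative only: estimates the chance that an unrelated person matches
    # a DNA profile on every marker, assuming independent markers and a
    # hypothetical 10% population frequency for each observed genotype.
    def random_match_probability(genotype_frequencies):
        probability = 1.0
        for frequency in genotype_frequencies:
            probability *= frequency
        return probability

    frequencies = [0.10] * 13  # 13 hypothetical markers, each shared by ~10% of people
    print(f"Random match probability: {random_match_probability(frequencies):.1e}")
    # Prints 1.0e-13, i.e., roughly one chance in ten trillion.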

Authenticating an individual is only one aspect of the broader set of identity management applications. Biometrics can be used to identify a single person in a crowd or to label each person presented to a system. Authentication refers to the specific case of verifying a claimed identity: the person is either who they claim to be or an impersonator. Impersonation may be uncommon, but for many applications it is important that impersonators be deterred or caught.
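The distinction can be made precise in code. The sketch below separates verification (authentication, a one-to-one comparison against a claimed identity) from identification (a one-to-many search of an enrolled gallery), using a toy similarity score on feature vectors in place of a real biometric matcher. The function names, the feature-vector representation, and the threshold are illustrative assumptions.

    # Toy sketch: verification (1:1) versus identification (1:N).
    # The similarity measure and threshold are placeholders for a real matcher.
    from typing import Dict, List, Optional

    def similarity(a: List[float], b: List[float]) -> float:
        """Toy score: 1 / (1 + Euclidean distance) between feature vectors."""
        distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return 1.0 / (1.0 + distance)

    def verify(enrolled: List[float], probe: List[float], threshold: float = 0.8) -> bool:
        """Authentication: does the probe match the single claimed identity?"""
        return similarity(enrolled, probe) >= threshold

    def identify(gallery: Dict[str, List[float]], probe: List[float],
                 threshold: float = 0.8) -> Optional[str]:
        """Identification: which enrolled identity, if any, best matches the probe?"""
        best_name, best_score = None, 0.0
        for name, template in gallery.items():
            score = similarity(template, probe)
            if score > best_score:
                best_name, best_score = name, score
        return best_name if best_score >= threshold else None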

Read more: Authentication Using Biometrics: How to Prove Who You Are

Can Humans Think?

In 1950, Alan Turing famously asked the question, “Can Machines Think?”

His seminal paper, “Computing Machinery and Intelligence,” helped launch the field of Artificial Intelligence (AI). Turing did not answer his own question, although he speculated that by the year 2000 machines would have passed his test for what he believed would constitute thinking, a test now known as “the Turing test.” But can a machine really think, or is it somehow artificial? And if a machine can convince humans that it can think, then can humans really think?

Read more: Can Humans Think?

Autonomous Vehicles: What’s the Deal?

Autonomous vehicle technology promises to make driverless vehicles a reality. Yet the introduction of commercial driverless vehicles has been delayed, and there are warning signs that the technology may not be ready any time soon. We list some of those warnings, the successes to date, and the challenges to introducing and integrating autonomous vehicles into the transportation enterprise. We also note some special cases where introduction might be possible in the short term. One difficulty is that autonomous vehicle technology is being developed under the assumption that the infrastructure (the roads and other vehicles) will provide minimal assistance. We note that government could accelerate development by providing standards, sensors, and communications as part of the infrastructure as improvements are made to roads and bridges.

Read more: Autonomous Vehicles: What’s the Deal?

Re-Embrace American Science and Technology: Reimagine, Reinvent, Restart

America must invest in bold, imaginative, and inspirational endeavors to tackle the hardest challenges facing the world, challenges that can only be overcome through inspired scientific research and inventive technological development. As Americans begin emerging from the pandemic’s long shadow, we look to the future and find ourselves at a unique crossroads. Congress and the Biden administration are considering massive infrastructure investments, economic stimuli, and funding for science and technology: programs on a scale not seen in nearly 100 years. The dramatic scale of these programs necessitates that we ask ourselves: How can we best leverage these investments to promote American interests, retain America’s leadership of the science and technology enterprise, and ensure the nation’s safety, security, and prosperity for years to come? The answer is to re-embrace American science and technology.

Read more: Re-Embrace American Science and Technology: Reimagine, Reinvent, Restart

Synthetic Biology

Introduction

The term “synthetic biology” was coined over a century ago. Since then, synthetic biology has grown into a diverse, multidisciplinary field that leverages tools, techniques, and ideas from biology, chemistry, engineering, computer science, medicine, bioinformatics, and many other disciplines. Closely allied to bioengineering, synthetic biology aims to create new biological elements or to redesign existing processes found in nature. Recent scientific breakthroughs and commercial tools have ushered in new advances and opportunities, and the development of these tools and the maturation of synthetic biology as an academic field have led to the discovery of new applications and the formation of companies.

The current field of synthetic biology is the culmination of decades of scientific breakthroughs and technological advancement. From the discovery of DNA and its function, to uncovering its double-helix structure, unraveling the genetic code, constructing a reference human genome, driving down the cost of genetic sequencing, and the more recent developments in genome editing, these basic research discoveries have enabled innumerable applications. In his recent book, author Walter Isaacson describes the CRISPR-Cas9 “genetic scissors”1 discovered by 2020 Nobelists Jennifer Doudna and Emmanuelle Charpentier as transforming “the future of the human race”2 and flags the many thorny ethical issues this transformative technology brings to the fore. Most notable are the issues surrounding germline editing, which would alter the inheritable human genome.3 These concerns, though extremely serious and justified, should not forestall the use of synthetic biology in its many other applications.

Depending on how broadly one defines synthetic biology, the market potential for new applications over the next few years is estimated in the tens of billions of US dollars annually.4 Venture capital funds5 and foundations6 have been established to support and accelerate these developments. Dozens of early-stage start-up companies have been formed in the US, while basic research continues in academic and corporate laboratories to provide greater understanding of the opportunities afforded by synthetic biology. Bringing applications to fruition and commercializing synthetic biology products, however, will require considerable work and technological expertise.

It is clear that the field is in its infancy, and thus the full range of applications remains largely unexplored. Many applications, as yet unimagined, might become possible as the field expands. For example, it is possible to modify DNA to admit replacements for one or more of the four nucleotides,7 or to expand the number and kind of nucleotides.8 One such experiment used eight different nucleotides. Such an expanded genetic alphabet will surely allow researchers to explore new possibilities and may result in new proteins and polynucleotides with previously unattainable, or even unimaginable, properties and functionalities. While some researchers work to build new biological systems and capabilities from the ground up, others start with the nearest facsimile nature already possesses and then, through directed evolution, teach that protein new, alternative functions. This allows biology’s catalysts, enzymes, to do more of the work that was once done by teams of human chemists.
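A simple count shows why an expanded genetic alphabet enlarges the design space: the number of distinct sequences of a given length grows as the alphabet size raised to that length, so moving from four to eight letters multiplies the number of codon-sized (three-letter) sequences eightfold. The Python sketch below works through the arithmetic; the four extra letter labels are placeholders for illustration, not the actual chemical names of synthetic bases.

    # Illustrative arithmetic: sequence counts for the natural four-letter DNA
    # alphabet versus a hypothetical eight-letter expanded alphabet.
    natural = ["A", "C", "G", "T"]
    expanded = natural + ["P", "B", "Z", "S"]  # placeholder labels for added bases

    for length in (3, 10):
        print(f"length {length}: {len(natural) ** length:,} natural vs "
              f"{len(expanded) ** length:,} expanded sequences")
    # length 3: 64 natural vs 512 expanded sequences
    # length 10: 1,048,576 natural vs 1,073,741,824 expanded sequences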

In this article, we discuss current activities in and progress across a number of synthetic biology application areas. In all cases, much more development can be expected, leading to a range of new products, many not envisioned here. As the academic field continues to organize and expand, the range of applications explored by startups and industry will likewise expand. Our focus here is on what is known now about applications and their logical extensions.

Read more: Synthetic Biology

© 2016-2021 Potomac Institute for Policy Studies. All rights reserved.


ISSN

STEPS (Online) ISSN: 2333-3200

STEPS (Print) ISSN: 2333-3219

