Formulating and Supporting a Hypothesis to Address a Catch-22 Situation in 6G Communication Networks (2024)

1. Introduction

Hypotheses hold a central place in fundamental research [1]. In science and technology, inquiry revolves around the hypothesis [2]: it is the quintessence of the scientific research context and carries special weight in science. A hypothesis often becomes the basis for defining the next steps and guides the entire process of scientific research. Although the methods of hypothesis-generating research are less rigorous, they do not replace or undermine the more rigorous hypothesis-testing or hypothesis-proving methodologies. Building a hypothesis is essential to establishing new paradigms that lay the foundation for discoveries. Except for a few rare, serendipitous inventions, almost every discovery the world has ever seen begins with a hypothesis. Whether a hypothesis is eventually proven or disproven, it never loses its importance as the beginning of a journey toward new knowledge. Historically, hypothesis-generating research has facilitated inventions that might not have been possible otherwise [3]. This report is not hypothesis-testing or hypothesis-proving research designed to empirically answer a known research question [4]. Rather, it is an analysis that builds a hypothesis and formulates a research question that researchers can design their studies to answer.

1.1. Research Background and Research Methodology

This research adopts a narrative and integrative literature review approach, suitable for an entirely new subject matter that needs further exploration [5]. The simultaneous, exponential growth of quantum computers (QC) and telecommunication networks presents both new opportunities and new problems. Through this review, the study aims to generate new perspectives on the security implications of QC for the projected technical specifications of 6G networks.

Formulating the research question is the first step in the research process and provides the foundation for framing the hypothesis. The research question should be feasible, interesting, novel, ethical, and relevant (FINER). Applying the FINER criteria can help ensure that the question is valid and will generate new knowledge with a global impact [6]. One such research question finds two next-generation technologies in conflict, substantially impacting the future of smart cities. Those technologies are quantum computers (QC) and 6G networks, both projected to premiere around 2030. The development of these technologies happens in tandem with the development of smart cities, which are built on the principle of increased connectivity, networkability, and computing speed of the digital infrastructure. Quantum computers have shown that they can process certain tasks exponentially faster than classical computers. In late 2019, Google claimed that its QC solved, within about 3 minutes, a problem that would take the world's fastest supercomputer 10,000 years [7]. Quantum computers are so powerful that they can wreak havoc on encryption. The public key-based cryptographic algorithms and Elliptic Curve Cryptography (ECC) certificate protocols behind many currently used cryptographic schemes can be broken using QC [8], posing an existential threat to humanity [9]. Quantum computers also render 6G networks vulnerable [10]. While many cybersecurity experts warn about the threat of "harvest now and decrypt later" (HANDL) attacks, a few attest that they are already happening. Cybercriminals may already be hoarding data for when QC becomes powerful enough to break current encryption standards [11]. Data is projected to be the new fuel of the 21st century; it must be produced, stored, and transmitted securely and efficiently. 6G will soon be the backbone of our future societies [12]. Any vulnerability in 6G networks must therefore be addressed urgently.

This research identifies a potential catch-22 situation in developing 6G networks and generates two research questions to build and support a hypothesis. Section 1.2 articulates a problem statement that the state of the art must resolve to mitigate the 6G security problem arising from the introduction of quantum computing. Section 1.3 presents an analysis of the problem. Section 1.4 describes the purpose of this research in the form of the research questions that this paper attempts to answer and gives a concise summary of the related work; these questions challenge the state of the art and set the backdrop for this paper. Section 2 frames the hypothesis, which eliminates the complexities ingrained in legacy systems, and builds support for the quantum-safe hypothesis on the future of 6G security. Sections 2.1 and 2.2 present the research questions and review the AZT (Absolute Zero Trust) approach and its impact on the efficiency of the 6G network. Section 3 lays out the limitations of this study. Section 4 discusses the possible future if this research meets its goal and opens a debate among 6G researchers on testing and proving the hypothesis.

1.2. Problem Statement: The Catch-22 Situation

Research on 6G networks currently faces a catch-22 situation, perhaps not envisaged when the 6G targeted parameter goals were planned [12]-[14]. However, as we approach the QC era, the security threats to 6G networks from QC have become real [11], resulting in new challenges in achieving at least three of the eight 6G goals, as illustrated in Figure 1 (highlighted in red-colored circles) and listed herein:

1) 1000 times lower cost compared to 5G,

2) Reliability, resilience, and security,

3) Very low latency.

1.3. Problem Analysis

6G networks face diverse challenges, such as resource-constrained mobile devices, difficult wireless resource management, the high complexity of heterogeneous network architectures, explosive computing and storage requirements, and privacy and security threats. 6G is targeted as a global communication facility with approximately a 1 Tb/s user bit rate and less than 1 microsecond of latency [13]. Zhang et al. argue that a 1000-fold price reduction from the customer's viewpoint is the key to the success of 6G [14] (Figure 1). The simultaneous advent of QC and 6G compels an upgrade of network security, as powerful QC will be able to break current encryption standards. Given the projected arrival of fully functional QC by 2030 and the projected timeline for the 6G launch, there is an urgent need to bolster defenses against Q-Day threats to 6G networks. Communication security experts are aggressively pursuing Post-Quantum Cryptography (PQC), and NIST (National Institute of Standards and Technology) is standardizing PQC algorithms to get them production-ready. However, PQC development faces two types of obstacles.

Figure 1. Impact of quantum computing on the projected goals of 6G networks.

1) Firstly, none of the 82 candidate PQCs in NIST's standardization initiative has so far proven to be unbreachable [15].

2) Secondly, PQC deployment will make the 6G goals almost unachievable, as most PQC algorithms rely on keys much larger than those of classical algorithms and will likely carry a higher computational cost than current RSA methods. These large keys consume more storage space and processing power, increasing the time and cost of implementation. The storage, computational, and latency implications of PQC depend on the key, ciphertext, and signature sizes, and on the computational efficiency of the encryption, encapsulation, and signature-verification operations as well as the private-key decryption, decapsulation, and signing operations.

Not counting its operational cost or energy efficiency, a recent high-performance implementation of CRYSTALS-Dilithium achieved the best-known latency, as low as 16.8 microseconds, on an Artix-7 chip at 142 MHz [16]. This is many times higher than the 1-microsecond target set for 6G networks. Moreover, PQCs are computationally expensive [17] and a likely obstacle to the desired 1000-fold price reduction [14].
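
To make the scale of this overhead concrete, the sketch below compares approximate, publicly reported parameter sizes for classical and post-quantum schemes and relates the 16.8-microsecond Dilithium latency cited above [16] to the 1-microsecond 6G latency target. The byte counts and the comparison itself are illustrative assumptions added for this argument, not measurements taken from the cited references.

```python
# Illustrative back-of-the-envelope comparison of classical vs. post-quantum
# key-material sizes and the latency gap discussed above. Byte counts are
# approximate published reference sizes; the 16.8 us figure is the FPGA
# latency reported in [16]. Treat all numbers as indicative, not benchmarks.

SIZES_BYTES = {
    # scheme: (public key, signature or ciphertext)
    "RSA-2048 (signature)":        (256, 256),
    "ECDSA P-256 (signature)":     (64, 64),
    "CRYSTALS-Dilithium2 (sig)":   (1312, 2420),
    "CRYSTALS-Kyber-768 (KEM ct)": (1184, 1088),
}

PQC_SIGN_LATENCY_US = 16.8      # best-known hardware latency reported in [16]
SIX_G_LATENCY_TARGET_US = 1.0   # 6G end-to-end latency goal cited in the text

def overhead_vs(baseline: str, scheme: str) -> float:
    """Return how many times more bytes `scheme` puts on the wire than `baseline`."""
    return sum(SIZES_BYTES[scheme]) / sum(SIZES_BYTES[baseline])

if __name__ == "__main__":
    for scheme, (pk, blob) in SIZES_BYTES.items():
        print(f"{scheme:30s} public key {pk:5d} B, payload {blob:5d} B")
    print(f"Dilithium2 vs ECDSA P-256 size overhead: "
          f"{overhead_vs('ECDSA P-256 (signature)', 'CRYSTALS-Dilithium2 (sig)'):.0f}x")
    print(f"PQC signing latency {PQC_SIGN_LATENCY_US} us exceeds the "
          f"{SIX_G_LATENCY_TARGET_US} us 6G target by "
          f"{PQC_SIGN_LATENCY_US / SIX_G_LATENCY_TARGET_US:.0f}x")
```

Even under these favorable assumptions, a single PQC signing operation alone would consume more than an order of magnitude of the entire 6G latency budget, before any network transmission is accounted for.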

If the current obstacles to PQC standardization are overcome and PQC succeeds in NIST’s standardization process, the critical challenge of latency and cost containment looms large over 6G networks. These networks demand ultra-low latency (beyond current PQC capabilities) to power real-time applications seamlessly. Therefore, cybersecurity solutions that offer blazing-fast, low-latency performance are undeniably crucial. Regardless of the fate of PQC algorithms in the NIST standardization initiative, the urgency of such solutions remains paramount in fortifying future 6G networks against quantum threats.

1.4. Purpose of the Research and Related Work

Existing systems cannot handle the ever-growing latency and connectivity needs of the Internet of Everything. 6G communication networks are expected to provide global coverage, enhanced spectral/energy/cost efficiency, and a higher level of intelligence and security. To meet these requirements, 6G networks will rely on several new enabling technologies, including new air-interface and transmission technologies and novel network architectures, such as waveform design, multiple access, channel coding schemes, multi-antenna technologies, network slicing, cell-free architecture, and cloud/fog/edge computing [18]. To build an intelligent and open 6G network, each node should have sufficiently low-latency communication and computing resources to support low-cost, self-evolving intelligent operations. 6G data rates are expected to be fifty times those of the fastest 5G network, with a tenth of the latency, support for ten times as many devices, and greater reliability. 6G will be able to connect everything, integrate different technologies and applications, support holographic, haptic, space, and underwater communications, and also support the Internet of Things. The design of future networks is anticipated to balance technological innovation, economic and environmental sustainability, and human-centric values.

In next-generation networks, everything will be fully connected, fulfilling the requirement of ubiquitous connectivity over wireless networks. This rapid growth will transform the world of communication with more intelligent and sophisticated services and devices, leading to new technologies operating over very high frequencies and broader bands. To achieve the objectives of 6G networks, several key technology enablers must be realized, including massive MIMO (Multiple-Input Multiple-Output), software-defined networking, network function virtualization, vehicle-to-everything communication, mobile edge computing, network slicing, terahertz communication, visible light communication, virtualization of the network infrastructure, and intelligent communication environments [19]. To achieve those goals, 6G will undergo several new paradigm shifts, of which security is the most challenging. Early 6G researchers often ignored the impending threats to this encryption-dependent communication network from quantum computers (QC), threats that become more real with each passing day. The situation will worsen when QC with sufficient qubits arrive to break current encryption algorithms. The Cloud Security Alliance launched a countdown to Y2Q (years to quantum), predicting just under six years until QC can crack current encryption [20]. It picked April 14, 2030, as the deadline by which the world must upgrade its IT infrastructure to meet the Y2Q threat (Figure 2). Even NATO and the White House recognize the threat and are preparing for Y2Q [21]. In April 2021, the Ransomware Task Force, a group of industry experts, submitted a report entitled "Combatting Ransomware: A Comprehensive Framework for Action" to the US government [21]. On May 12, 2021, in response to this report, President Biden [22] issued an Executive Order entitled "Improving the Nation's Cybersecurity," which requires that the US advance toward a "Zero Trust Architecture," as described by NIST (National Institute of Standards & Technology) [23]. However, even standard Zero Trust architecture, which remains policy-based, may not be enough for autonomous networks like 6G [13].

Figure 2. Countdown to Q-Day (Y2Q). Credit: Cloud Security Alliance [20].

QC wields unprecedented computing power, posing a formidable threat to future 6G infrastructure. Its ability to break traditional encryption could compromise the security of sensitive data transmitted over 6G networks. As QC seems so close to becoming a reality, any cybersecurity strategy that ignores it may be short-sighted. QC will never replace classical computers for real-world general-purpose tasks, nor is it intended to do so; rather, it will become integral to high-performance computing (HPC) for specialized use cases and important scientific tasks [24].

To safeguard the future of communication, quantum-resistant security standards are imperative. The global response to the impending Q-Day threat is evident in initiatives such as NIST's program for developing quantum-safe encryption standards, which began in November 2017 with the submission of 82 candidate post-quantum cryptography (PQC) algorithms. PQC is being aggressively developed to secure our cryptography-dependent digital infrastructure within the Zero Trust (ZT) cloud computing continuum recommended by NIST [25]. In 2019, the results of the first round of the 82 PQC candidates entering the standardization process were published [26]. In 2022, two of the four finalist PQC candidates were broken by ethical hackers using standard computing devices, sending a shockwave through the cybersecurity community. Last year, a Swedish group [27] and a French team of cryptographers cracked the remaining finalist PQCs (CRYSTALS-Kyber and CRYSTALS-Dilithium) [28]. PQCs [29], particularly the NIST finalist Kyber [30], remain vulnerable to side-channel attacks. With all the PQC candidates faltering, NIST's standardization process is seriously jeopardized. QC indeed appears more detrimental to human interests than the benefits it delivers [31]. A solution is therefore urgently needed.

PQC is the only defense currently explored by researchers and regulatory authorities to secure the Internet from the Q-Day threat. Although computer security relies heavily on cryptography, recent evidence indicates that it can transcend encryption by deploying Zero Vulnerability Computing (ZVC) technology [32]. A series of recent reports discloses a novel way to deal with the impending Q-Day threat by segregating all QC activities from the mainstream Internet instead of deploying resource-intensive PQC on every Internet device [15] [32] [33]. These reports deploy a new Zero Vulnerability Computing (ZVC) paradigm that proposes a computer architecture banning all third-party permissions to reduce the computer's attack surface to zero and achieve zero vulnerability [15] [32]-[35]. The approach delivers QC services to customers in a Quantum-as-a-Service (QaaS) business model [33] [34]. ZVC is an encryption-agnostic approach that can potentially render computers quantum-resistant by banning all third-party permissions, a root cause of most vulnerabilities. A 6G security approach can build on the principal objective of designing a ZVC computing environment that eliminates the complexities of the traditional multi-layered architecture of legacy computing devices and builds a minimalist, compact Solid-State Software on a Chip (3SoC) device that is robust, resilient, and energy efficient, with a zero attack surface that renders it resistant to malware as well as future Q-Day threats [15] [32]-[35].
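
The cited ZVC/3SoC reports do not specify an implementation at code level. Purely as an illustrative interpretation of "banning all third-party permissions," the minimal sketch below shows a permission gate that grants requests only to a hypothetical allow-list of first-party components and denies everything else; all component and permission names are invented for illustration and are not part of the actual ZVC/3SoC design.

```python
# Minimal, purely illustrative sketch of the "no third-party permissions" idea
# attributed to ZVC in the text. This is NOT the actual ZVC/3SoC design; it only
# illustrates how an allow-list of first-party components could shrink the
# permission attack surface.

FIRST_PARTY_COMPONENTS = {"kernel", "network_stack", "system_updater"}  # hypothetical names

class PermissionDenied(Exception):
    """Raised whenever a non-first-party component requests any permission."""

def request_permission(component: str, permission: str) -> bool:
    """Grant a permission only to built-in first-party components; deny everything else."""
    if component not in FIRST_PARTY_COMPONENTS:
        raise PermissionDenied(
            f"third-party component '{component}' may not request '{permission}'"
        )
    return True

if __name__ == "__main__":
    print(request_permission("network_stack", "open_socket"))     # allowed
    try:
        request_permission("downloaded_app", "read_filesystem")   # denied
    except PermissionDenied as err:
        print("blocked:", err)
```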

2. Framing the Hypothesis

This is hypothesis-generating research designed to generate and formulate a new research question. It is not hypothesis-testing or hypothesis-proving research designed to empirically answer an already known research question [1]-[4]. The methods of hypothesis-generating research are less rigorous and do not replace or undermine more rigorous hypothesis-testing or hypothesis-proving methodologies. Nevertheless, they are important in building new paradigms that lay the foundation for discoveries. Almost all discoveries begin with a hypothesis. Whether a hypothesis is eventually proven or disproven, it never loses its importance as the beginning of a journey to new knowledge that would not have been possible otherwise [3].

2.1. Generating the Exploratory Research Questions

Based on the latest peer-reviewed evidence on QC and its impact on 6G security via resource-intensive post-quantum cryptography (PQC), the following research question is reasonable to explore:

Will the advent of quantum computers make the latency and pricing goals of 6G networks unachievable?

An affirmative answer to this question leads to exploring an approach to mitigate the adverse impacts of QC on 6G, which leads to a second research question:

Can Absolute Zero Trust (AZT) security architecture deliver autonomous quantum-safe security to 6G networks and guarantee their latency and price-reduction goals?

In a structured research methodology, a hypothesis must be formulated, supported, tested, and validated to answer a research question unequivocally. Hence, a hypothesis is formulated and framed to answer the second question and render the concept a technological reality.

2.2. Formulating and Supporting the Hypothesis

Hypothesis: Absolute Zero Trust security architecture delivers autonomous quantum-safe security to 6G networks, guaranteeing their latency and price reduction goals.

In 2020, NIST defined Zero Trust as "a term for an evolving set of cybersecurity paradigms that move defenses from traditional static, network-based perimeters to focus on users, assets, and resources" [36]. Zero Trust Architecture has been proposed for securing 6G networks [37]. However, as illustrated in Figure 3, all Zero Trust initiatives are policy-based and cannot be autonomous unless empowered by AI [38] [39]. 6G is not just a communication technology but the backbone of all of a smart city's heterogeneous computing needs. It enables the future metaverse, with sensors and IoT devices that need continuous autonomous control. Therefore, 6G networks must be autonomous, and Zero Trust Artificial Intelligence is considered an essential component of 6G [40].
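
For readers unfamiliar with what "policy-based" means in practice, the following minimal sketch, loosely in the spirit of the policy decision point described in NIST SP 800-207, evaluates each access request against explicit policy rules. All field and rule names are hypothetical; real deployments evaluate far richer signals per request (identity, device posture, behavior analytics), which is precisely where the latency, cost, and automation concerns raised above arise.

```python
# Minimal sketch of a policy-based Zero Trust access decision. All rule and
# field names are hypothetical and illustrative only; this is not the NIST
# reference architecture, just an illustration of per-request policy checks.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_trusted: bool
    resource: str
    mfa_passed: bool

def policy_decision(req: AccessRequest) -> bool:
    """Return True only if every configured policy rule passes (never trust by default)."""
    rules = [
        req.device_trusted,   # device must satisfy posture policy
        req.mfa_passed,       # user must have completed multi-factor authentication
        # hypothetical rule: only operator accounts may touch the 6G core control plane
        req.resource != "core_6g_ctrl" or req.user.endswith("@operator"),
    ]
    return all(rules)

if __name__ == "__main__":
    req = AccessRequest(user="alice@operator", device_trusted=True,
                        resource="core_6g_ctrl", mfa_passed=True)
    print("access granted" if policy_decision(req) else "access denied")
```

Because every request re-runs such an evaluation, policy-based Zero Trust adds per-request processing on top of any cryptographic cost, which is the overhead the AZT hypothesis seeks to avoid.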

Figure 3. PQC implemented via standard Zero Trust architecture (policy based) increases the cost and latency of the 6G network.

A recent report discloses a seamless and autonomous Absolute Zero Trust (AZT) security framework that runs continuously and autonomously without monitoring the network [15] [41]. Such AZT is encryption-agnostic and, therefore, quantum-resistant. It is also light, energy-efficient, fast, and low-cost (Figure 4), as it does not rely on resource-intensive PQC.

Figure 4. 6G network secured via AZT is autonomous, fast, energy-efficient, and low-cost.

It is projected that by 2030, when 6G and QC are expected to be deployed, more than 100 trillion sensors will be manufactured and connected to the Internet [42]. Specifically, by examining previous generations of wireless communications, experts predict that 6G networks will offer a wireless connection for less than 0.1 US dollars per year, a 1000-fold price reduction compared with conventional 5G systems. Hence, Zhang et al. concluded that a 1000-fold price reduction, approaching 0.1 US dollars per year per connection, will be necessary for 6G to maintain the sustainable development of the smart society [12]. All the evidence suggests that QC is at an inflection point [15] [34] [43] [44], compelling us to prepare for this new computing paradigm. Recent setbacks may jeopardize NIST's original timeline for PQC standardization, which in 2019 was estimated at 15 years (to 2034) for a full transition to a quantum-safe Internet (Figure 5). Moreover, global PQC implementation is a massive undertaking that touches every computing device in the Internet ecosystem; it is not just time-consuming but also resource-intensive and expensive. By prioritizing the development and implementation of cybersecurity solutions optimized for minimal latency, stakeholders can ensure the robustness and efficiency of 6G networks amidst the ever-evolving landscape of quantum technologies.
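
As a back-of-the-envelope check on why Zhang et al. regard the 1000-fold price reduction as necessary, the snippet below combines the figures quoted above (roughly 100 trillion connections and a 0.1 US dollar per year target price) into implied aggregate annual connectivity costs. These are simple extrapolations of the cited projections, not independent estimates.

```python
# Back-of-the-envelope arithmetic using the projections quoted in the text:
# ~100 trillion connected sensors by 2030 [42] and a 6G target price of about
# 0.1 USD per connection per year, a 1000x reduction versus 5G [12] [14].

CONNECTIONS_2030 = 100e12          # ~100 trillion sensors/devices
TARGET_PRICE_6G = 0.1              # USD per connection per year (6G goal)
PRICE_REDUCTION_FACTOR = 1000      # 6G goal relative to 5G

implied_price_5g = TARGET_PRICE_6G * PRICE_REDUCTION_FACTOR   # ~100 USD/year
aggregate_cost_6g = CONNECTIONS_2030 * TARGET_PRICE_6G        # USD/year at 6G pricing
aggregate_cost_5g = CONNECTIONS_2030 * implied_price_5g       # USD/year at 5G-era pricing

print(f"Implied 5G-era price per connection : {implied_price_5g:,.0f} USD/year")
print(f"Aggregate cost at 6G target pricing : {aggregate_cost_6g:,.0f} USD/year")
print(f"Aggregate cost at 5G-era pricing    : {aggregate_cost_5g:,.0f} USD/year")
```

At 5G-era pricing, the implied aggregate cost would be on the order of ten quadrillion US dollars per year, far beyond global economic output, which underscores why the 1000-fold price goal is treated as non-negotiable and why costly security add-ons threaten it.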

Figure 5. AZT Timeline in comparison to PQC and 6G. Adapted from Raheman F. Tackling the existential threat from quantum computers and AI [45].

3. Limitation & Caveats

This is not hypothesis-testing or hypothesis-validating research, which most peer-reviewed literature is, and should be. However, that does not underplay the importance of hypothesis-building research, on which almost all peer-reviewed work piggybacks to build the foundation of new knowledge [1]-[4]. A new concept cannot be comprehensively tested or proven without methodically framing and building a hypothesis for it, all the more so when the subject matter is as topical and time-sensitive as quantum computing and 6G. This hypothesis-building research explicitly emphasizes the conceptual stage of the research and does not claim to be an improvement on an earlier report. The principal objective of this research is to make the hypothesis available to a broader community of 6G researchers for testing and proving or disproving it, to help shape the future of 6G. This report is no more than hypothesis-generating research intended to build and formulate a hypothesis that researchers worldwide can design experiments to test and prove or disprove in the near future. Until such studies are conducted, great care should be taken in extrapolating the findings of this report to real-world settings [45]. Such studies begin by defining the specific research questions: What will the process be? What protocols will be designed to implement the process? What KPIs will be appropriate to evaluate and control the protocols [45]? These and many other questions come to mind when planning the future of our 6G-powered digital infrastructure.

4. Conclusions

6G communication networks are envisioned to nurture the future of a ubiquitously connected, data-intensive, intelligent ecosystem powered by the complete automation of wireless networks spread across the ground, underwater, air, and space. Moreover, 6G is also envisaged to deal with the explosive growth in mobile traffic, estimated to reach several hundred billion gigabytes (GB) per month by 2025 and trillions of gigabytes per month by 2030 [46], driven by emerging data-intensive speculative applications [47] within the metaverse, AI, and autonomous mobility space [48]. To serve these future applications better by seamlessly interconnecting a staggering number of heterogeneous devices, the next generation of mobile networks is, by and large, expected to be inherently softwarized, virtualized, and cloudified [49]. Quantum computers, slated to premiere around the same time in 2030, will adversely impact the security of such softwarization, virtualization, and cloudification of mobile networks, making it challenging to meet the projected 6G parameter goals.

This report contributes to the discourse on securing 6G networks without compromising their latency and pricing goals. The principal objective of this research is to identify the most serious pain points that 6G development is facing on account of QC, which is currently perceived as an existential risk to humanity by many experts. The research conducted a thorough literature review to generate a clearly articulated hypothesis that provides a credible path to mitigating the 6G pain points in a timely manner.

The empirical evidence in the peer-reviewed literature provided enough basis to support the above hypothesis and sufficient motivation for 6G and QC researchers to conduct further research to test and prove it. This work introduces new ideas, new thinking, and a new understanding of network security, which can be helpful to researchers, thinkers, 6G and cybersecurity developers, regulators, and practitioners working to secure the future Internet.

Future research may be directed toward testing and proving the hypothesis to achieve adequate security of 6G while maintaining its latency and pricing goals. The concept enshrined in the hypothesis may invite academic interest in peer review or grant writing, but it may not be conducive to immediate real-world business decisions, and it may face technological challenges until it takes root in our empirical, multidisciplinary research methods. This article intends to spark and encourage further, in-depth discussion around these topics, as we owe researchers and policymakers a clear vision of the future. As cyber threats to our digital infrastructures continue to evolve, such research is crucial in ensuring the integrity and confidentiality of information across the Internet, reinforcing the overall cybersecurity landscape of our digital infrastructures.

Acknowledgements

The author is grateful to Tejas Bhagat and Sadiya Khan for their assistance in preparing this manuscript.
