Incumbent vendors cannot ignore the market-disrupting implications of AT&T’s and Swisscom’s outsourced 5G mobile cores

At the tail end of June, both AT&T and Swisscom announced plans to outsource their 5G mobile core networks to cloud service providers. AT&T’s deal is with Microsoft Azure; Swisscom’s is with AWS. Under the AT&T deal, Microsoft will acquire IP and engineering expertise from AT&T that it can resell to other operators. It was also reported that Microsoft will offer jobs to current AT&T employees. This move suggests Microsoft understands how critical it will be to retain talent with deep networking knowledge. Swisscom’s deal does not appear to include AWS acquiring any Swisscom assets or employees, but the operator did say it will use AWS for its own IT applications.

Spotlight

ISP Supplies

ISP Supplies provides complete integrated global networking solutions and technologies to its customers. Since 2009, we have implemented wired and wireless equipment designed to handle ever-increasing performance requirements.

OTHER ARTICLES
Network Management, Network Security

What's New In 5G - May 2021

Article | July 17, 2023

The next generation of wireless technologies, known as 5G, is here. Not only is it expected to offer network speeds up to 100 times faster than 4G LTE and reduce latency to nearly zero, it will allow networks to handle 100 times the number of connected devices, revolutionizing business and consumer connectivity and enabling the "Internet of Things." Leading policymakers, including federal regulators and legislators, are making it a top priority to ensure that the wireless industry has the tools it needs to maintain U.S. leadership in commercial 5G deployments. This blog provides monthly updates on FCC actions and Congressional efforts to win the race to 5G.

Read More
5G

Key Network Performance Metrics to Improve Efficiency

Article | September 28, 2023

Discover key network performance metrics to enhance user experience. Explore in-depth latency, throughput, jitter, packet loss, VoIP quality, and MOS score to optimize network performance analysis.

Contents
1. Importance of Network Performance Metrics for Performance Analysis
2. Critical Key Network Performance Metrics to Monitor
2.1 Latency
2.2 Throughput
2.3 Jitter
2.4 Packet Loss
2.5 VoIP Quality
2.6 MOS Score
3. Steps to Monitor and Measure Network Performance
4. Significance of Monitoring Metrics in Network Troubleshooting
4.1 Provides Network Visibility
4.2 Prevents Network Downtime
4.3 Observes Bandwidth Usage
5. Overcome Monitoring Challenges in Network Performance Metrics
6. Key Takeaway

1. Importance of Network Performance Metrics for Performance Analysis

Network performance involves analyzing and evaluating network statistics to determine the quality of services provided by the underlying computer network. It is primarily measured from the end-user's perspective, considering various key network metrics. Assessing network performance requires measuring these metrics, analyzing performance data over time, and understanding their impact on the end-user experience.

Measuring network performance also requires considering factors such as the location and timing of measurements. For instance, performance may differ when comparing paths between cities, or during periods of varying user demand throughout the day. A comprehensive approach to monitoring therefore involves identifying these variables and pinpointing areas for improvement.

Network performance metrics offer valuable insights into any network infrastructure and its services. They provide real-time information on potential issues, outages, and errors, allowing IT resources to be allocated efficiently. Understanding end-user demands makes it possible to build an adaptive network that meets future business needs.
However, comprehensive monitoring requires an advanced network monitoring tool to gather, analyze, and interpret data effectively. Leveraging the relevant metrics can improve network performance, support informed decisions, enhance network reliability, and deliver a superior user experience.

2. Critical Key Network Performance Metrics to Monitor

2.1 Latency

Latency, or network delay, quantifies the time required to transmit data between destinations. Factors like packet queuing and fiber optic cabling affect it, and consistent delays or sudden spikes in latency indicate significant performance issues. By actively tracking latency, organizations can identify and address issues that delay data transmission, improving overall network responsiveness and minimizing disruptions for end-users.

2.2 Throughput

Throughput measures the rate of data transmission across various network segments. Unlike bandwidth, which represents the theoretical data transfer limit, throughput reflects the data packets actually delivered to their destination, and it can vary across different areas of the network. Low throughput indicates dropped packets requiring retransmission and highlights performance issues that need attention. Monitoring throughput gives organizations insight into the actual data transmission rate and whether it aligns with expected levels.

2.3 Jitter

Jitter refers to the variation in delay between packets, measured as the difference between expected and actual arrival times.
Jitter results from network congestion, routing issues, or other factors, disrupting the normal sequencing of data packets and leading to packet loss and degraded application performance. Monitoring jitter is crucial for identifying network stability issues and ensuring reliable data transmission: by tracking it, organizations can address variations in packet delay before they lead to packet loss, enabling proactive troubleshooting.

2.4 Packet Loss

Packet loss represents the number of data packets lost during transmission. It directly affects end-user services, leading to unfulfilled data requests and potential disruptions, and can arise from software problems, network congestion, or router performance issues. Precisely monitoring the transmission path to detect and address packet loss ensures reliable data transmission and optimal network performance; the right network monitoring software enables timely troubleshooting and optimization of the network infrastructure, ultimately enhancing overall reliability.

2.5 VoIP Quality

VoIP (Voice over Internet Protocol) quality refers to the overall performance of a VoIP system in delivering clear and reliable voice communications over the Internet, replacing traditional phone lines. Factors influencing VoIP quality include network bandwidth, latency, packet loss, jitter, and the quality of end-user devices. Key performance indicators (KPIs) such as mean opinion score (MOS), jitter, latency, packet loss, and call completion rates are used to assess and optimize VoIP quality.
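The metric definitions above (throughput versus bandwidth, interarrival jitter, sequence-number-based packet loss) can be sketched in a few lines of code. The following Python fragment is an illustrative toy, not any vendor's implementation; the jitter estimate uses the exponentially smoothed formula from RFC 3550:

```python
def throughput_mbps(bytes_delivered, seconds):
    """Goodput in Mbit/s: bytes actually delivered over an interval.
    Bandwidth is the link's theoretical ceiling; throughput is what arrived."""
    return bytes_delivered * 8 / seconds / 1_000_000

def interarrival_jitter(transit_times_ms):
    """Smoothed interarrival jitter (RFC 3550 style) from one-way transit
    times in milliseconds (arrival timestamp minus send timestamp)."""
    jitter = 0.0
    for prev, curr in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(curr - prev)           # change in transit time between packets
        jitter += (d - jitter) / 16.0  # 1/16 gain, per RFC 3550
    return jitter

def packet_loss_pct(sent_seqs, received_seqs):
    """Packet loss percentage from sent vs. received sequence numbers."""
    lost = len(set(sent_seqs) - set(received_seqs))
    return 100.0 * lost / len(sent_seqs)

print(throughput_mbps(12_500_000, 1.0))                      # -> 100.0
print(interarrival_jitter([40.0, 40.0, 40.0]))               # -> 0.0
print(packet_loss_pct(range(10), [0, 1, 2, 4, 5, 6, 8, 9]))  # -> 20.0
```

Note how steady transit times produce zero jitter, and two missing sequence numbers out of ten register as 20% loss.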
2.6 MOS Score

Mean opinion score (MOS) rates the perceived quality of a voice call on a scale of 1 to 5. It is a standardized measurement developed by the ITU, an international agency focused on enhancing communication networks. Initially designed for traditional voice calls, MOS has been adapted to evaluate Voice over IP (VoIP) calls. The score considers various factors, including the specific codec employed for the call, providing a comprehensive assessment of voice call quality.

3. Steps to Monitor and Measure Network Performance

Step 1: Deploy Network Monitoring Software

To measure network performance effectively, deploying dedicated network monitoring software is crucial. Ad hoc tools like traceroute and ping can provide insight into ongoing problems, but they are insufficient for troubleshooting intermittent issues: relying on periodic checks leaves detection to chance, since a problem is caught only if it occurs while the tool is running. Comprehensive monitoring software allows proactive analysis of network metrics, historical data, and performance, enabling timely detection and resolution of both ongoing and intermittent issues.

Step 2: Distribute Monitoring Agents

For comprehensive measurement, businesses must distribute monitoring agents strategically across key network locations. These specialized software agents continuously monitor network performance using synthetic traffic, simulating and assessing the end-user perspective. By distributing monitoring agents, organizations can:

• Measure key network metrics, including jitter, packet loss, and throughput.
• Identify and troubleshoot intermittent network issues that are challenging to pinpoint.
• Receive alerts about any performance degradation, ensuring a timely response.
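As a worked illustration of how a MOS estimate can be derived (the exact method varies by monitoring tool), the ITU-T G.107 E-model maps a transmission rating factor R, which impairments such as latency, loss, and codec choice push downward, to an estimated MOS. A common simplified form of that mapping is sketched below:

```python
def r_to_mos(r: float) -> float:
    """Map an E-model transmission rating factor R to an estimated MOS,
    using the standard ITU-T G.107 R-to-MOS conversion."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

# A clean narrowband call has R around 93; impairments push R (and MOS) down.
print(round(r_to_mos(93.2), 2))  # -> 4.41
```

The 1-to-4.5 output range matches the practical ceiling of the 1-to-5 MOS scale: even a perfect narrowband call is not rated a full 5 by listeners.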
• Collect valuable data for in-depth troubleshooting and analysis, facilitating proactive network management and optimization.

Step 3: Measure Network Metrics

Once deployed, the monitoring agents continuously exchange synthetic User Datagram Protocol (UDP) traffic, forming a network monitoring session. During this session, the agents measure network performance by evaluating key metrics and conducting network traffic analysis. The results are presented in a network response time graph, providing a visual representation of the network's performance characteristics. Monitoring and analyzing these metrics gives organizations valuable insight into network performance, supporting informed decision-making and convenient troubleshooting.

4. Significance of Monitoring Metrics in Network Troubleshooting

4.1 Provides Network Visibility

Monitoring metrics play a vital role in network troubleshooting by offering visibility into the network. They enable the identification of performance bottlenecks, configuration problems, and security vulnerabilities that degrade network performance. These issues can then be addressed through targeted troubleshooting, resulting in improved performance, an enhanced end-user experience, optimal network functionality, and overall business productivity.

4.2 Prevents Network Downtime

Effective monitoring metrics are instrumental in preventing network downtime, a costly concern for businesses. Swift identification and resolution of network issues through proactive troubleshooting minimizes downtime, ensuring uninterrupted business operations and safeguarding against lost productivity, lost revenue, and customer dissatisfaction.
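As an illustrative sketch of such a session (a toy stand-in for commercial agents, with a local UDP echo thread playing the role of the remote peer), the following Python measures round-trip times over synthetic UDP traffic:

```python
import socket
import statistics
import threading
import time

def run_echo_server(sock):
    """Echo UDP datagrams back to the sender (toy stand-in for a remote agent)."""
    while True:
        data, addr = sock.recvfrom(1024)
        sock.sendto(data, addr)

def measure_rtts(server_addr, count=5):
    """Send `count` synthetic UDP probes; return round-trip times in ms."""
    probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    probe.settimeout(1.0)
    rtts = []
    for seq in range(count):
        start = time.perf_counter()
        probe.sendto(str(seq).encode(), server_addr)
        probe.recvfrom(1024)  # a timeout here would count as packet loss
        rtts.append((time.perf_counter() - start) * 1000.0)
    probe.close()
    return rtts

# Spin up a local echo "agent" and probe it over loopback.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

rtts = measure_rtts(server.getsockname())
print(f"min/avg/max RTT: {min(rtts):.3f}/{statistics.mean(rtts):.3f}/{max(rtts):.3f} ms")
```

Real agents add sequence numbers and timestamps to each probe so that loss and one-way jitter can be computed alongside RTT; over loopback the measured RTTs will be a fraction of a millisecond.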
Maintaining a proactive approach to monitoring and resolving network issues enhances network reliability and business continuity.

4.3 Observes Bandwidth Usage

Monitoring metrics also enable the observation of bandwidth usage. This allows organizations to detect abnormal or excessive utilization, pinpoint key performance issues, and ensure optimal resource allocation. It helps identify bandwidth-hogging applications and network intrusions, so experts can take immediate action to mitigate risks, safeguard data, and protect overall network integrity, while optimizing performance and ensuring a seamless user experience.

5. Overcome Monitoring Challenges in Network Performance Metrics

Enterprises seeking optimal network performance must overcome several network monitoring obstacles. Effectively monitoring, tracking, and improving performance requires a strategic combination of skilled personnel, advanced technologies, and well-defined strategies; failing to address these requirements leads to challenges that hinder improvement efforts. Common challenges include managing scalability, handling massive data volumes, achieving real-time monitoring, dealing with multi-vendor environments, addressing network security and privacy concerns, and adapting to evolving network demands. Each obstacle presents unique complexities that require tailored approaches and expert insight.

To overcome these challenges, enterprises must invest in comprehensive monitoring tools capable of handling the scalability demands of growing networks. These tools should provide real-time network visibility, robust analytics capabilities, and intelligent data filtering mechanisms to extract meaningful insights from vast volumes of network data.
Establishing clear monitoring objectives aligned with business goals and defining key performance indicators (KPIs) are also essential to addressing network performance challenges effectively.

6. Key Takeaway

Monitoring network performance metrics is crucial for assessing, from the end-user's perspective, the quality of services a computer network provides. It involves continuously tracking and analyzing key metrics such as latency, throughput, jitter, packet loss, VoIP quality, and MOS score. By implementing dedicated network monitoring software and strategically deploying monitoring agents across the network, organizations can actively assess performance, proactively identify intermittent issues, and collect valuable data for in-depth analysis. Monitoring metrics also mitigates the potential financial impact of network downtime, improves the utilization of available bandwidth, and helps tackle the complexities of scaling, real-time monitoring, diverse vendor ecosystems, security concerns, and the ever-evolving requirements of modern networks.

Read More
Unified Communications, Network Security

SA and NSA: The Difference Between 5G Architectures

Article | July 10, 2023

Choosing the right 5G architecture is crucial for enhancing operations while keeping scalability and budget in mind. Learn whether SA or NSA is more suitable for your business needs with this article.

Contents
1. Introduction to 5G Network Architectures
2. What is 5G SA?
2.1 Characteristics of SA Architecture
2.2 Benefits of SA Architecture
3. What is 5G NSA?
3.1 Characteristics of NSA Architecture
3.2 Benefits of NSA Architecture
4. Factors to Consider When Choosing Between SA and NSA
4.1 Cost Implications of Each Architecture
4.2 Future Implications of Each Architecture
5. Conclusion

1. Introduction to 5G Network Architectures

Widespread implementation of 5G is transforming how businesses across verticals operate, providing enhanced speed, low latency, and massive connectivity. The advancements in 5G system architecture enable new use cases, from autonomous vehicles to smart cities. There are currently two types of 5G network architecture: 5G standalone (5G SA) and 5G non-standalone (5G NSA). The two differ in how they connect to the existing 4G infrastructure, the type of equipment required, and the level of network independence. Understanding the difference between SA and NSA is therefore crucial for companies and organizations implementing 5G.

2. What is 5G SA?

5G SA architecture is an entirely new technology built on a 5G core network, independent of the current 4G LTE network. It supports use cases such as combining 5G with AI and edge computing.

2.1 Characteristics of SA Architecture

Independent Network: All components of the architecture, including the 5G core, radio access network, and user equipment, are independent of any 4G technology.
High Performance: 5G SA is optimized for high performance and low latency, enabling fast data transfer rates and near-instantaneous response times.
Distributed Architecture: This allows efficient resource allocation and dynamic management of network resources.
End-to-End Encryption: SA provides end-to-end encryption, ensuring data is secure and protected from unauthorized access.
Higher Cost: 5G SA is more expensive to implement than NSA because it requires a fully independent 5G network infrastructure.

2.2 Benefits of SA Architecture

Low Latency: 5G applications that require real-time processing are only possible with SA architecture.
Customization: Because SA does not depend on the existing network architecture, it can be tailored to company requirements; it also enables network slicing for 5G enterprise private network use cases.
Security: End-to-end encryption makes the network more secure, and 5G network slicing keeps different access levels separate.
Scalability: SA architecture is designed to be highly scalable and to handle large volumes of data and devices.
Future-proofing: SA architecture will support upcoming 5G features and capabilities by design.

3. What is 5G NSA?

5G NSA provides a transition to 'true' 5G by incorporating existing 4G network infrastructure into its deployment.

3.1 Characteristics of NSA Architecture

Non-Independent Network: 5G NSA leverages the existing 4G infrastructure to deliver 5G services.
Transition to SA: NSA offers lower latencies and faster speeds than 4G LTE without a full standalone 5G deployment, serving as a stepping stone to SA.
Integrated Deployment: 5G NSA can be deployed quickly since it integrates with existing infrastructure.
Limited Scalability: Because it relies on the existing 4G infrastructure, NSA places lower limits on how many devices can join the network and on the data volume that can be processed.

3.2 Benefits of NSA Architecture

Faster Deployment: 5G NSA can be deployed more rapidly than SA architecture.
Easier Integration: Integration with existing networks is easier since NSA reuses the 4G architecture.
Cost-effective: 5G NSA is generally less expensive to implement because it doesn't require a complete overhaul of the existing infrastructure to a 5G core architecture.
Improvement Over 4G: While not providing the speed and low latency of 'true' 5G, NSA offers significant improvements over 4G networks.

4. Factors to Consider When Choosing Between SA and NSA

4.1 Cost Implications of Each Architecture

SA architecture requires a complete overhaul of the existing infrastructure, which can mean higher infrastructure and deployment costs. However, it can be more cost-effective in the long run thanks to its future-proof design and greater scalability and customization. NSA architecture, by contrast, leverages the existing 4G infrastructure, resulting in lower infrastructure and deployment costs; however, upgrading and maintaining a 4G network to support 5G technology can be complex and may result in higher operational costs over time.

4.2 Future Implications of Each Architecture

SA architecture is designed to be future-proof and scalable, supporting upcoming 5G features and capabilities. This gives organizations greater flexibility and agility to respond to changing business needs and emerging technologies. NSA architecture may be less future-proof and may require additional investment in infrastructure and resources to support new 5G features and capabilities.

5. Conclusion

While NSA architecture may offer lower upfront costs and a faster deployment timeline, SA architecture may be more future-proof and scalable in the long run. Choosing the appropriate 5G architecture is a critical decision for organizations aiming to use 5G technology to build the connected industry of the future. Organizations must evaluate their requirements and weigh each architecture's short- and long-term costs and operational implications before making a decision.

Read More

Edge Computing and the Future of the Data Center

Article | September 16, 2021

If you are clued into IT, you are most likely aware of the latest trending technology: edge computing data centers. Edge computing delivers exceptional speed with stronger privacy and security than conventional cloud approaches, making edge data centers a compelling option. The world is undoubtedly moving faster, perpetually pushing the power of next-generation innovation.

Edge computing has emerged as an alternative to cloud computing that keeps data processing power at the "edge" of the network. But it also brings challenges: edge computing devices with processing functions are expensive, and operating older equipment requires additional hardware, which incurs extra expenditure. Despite the challenges, edge computing has turned out to be a major technology investment. So let's break it down to understand how this technology is set to shape the future of the data center.

A Brief on Edge Computing

The word "edge" refers to the literal geographic distribution that brings computation and data storage nearer to the data sources. It improves response times and saves bandwidth by running fewer processes in the cloud and shifting them to local destinations such as a user's computer, an edge server, or an IoT device. In a nutshell, edge computing is a topology that enables data to be analyzed, processed, and transferred at the edge of a network. It helps reduce the long-distance communication between a client and a server.

A significant advantage of edge computing lies in its high speed and better reliability. In addition, it offers improved security by distributing processing, storage, and applications across a wide range of devices and data centers.
What's more, it paves the way for a budget-friendly route to scalability and versatility, enabling organizations to expand their computing capabilities through a combination of IoT devices and edge computing data centers.

Edge Data Centers and Their Usage

There isn't one specific definition of an edge data center, since it isn't a single consistent style of facility. Instead, the term covers smaller facilities that serve both edge computing and larger-scale cloud services. Because they are located closer to the population they serve, they can extend the edge of the network to deliver cloud computing resources and cached content to end-users, typically connecting seamlessly to a larger central data center or to multiple data centers.

Latency has always been a concern for cloud data center managers, and it has recently become a key obstacle due to big data, the Internet of Things, cloud and streaming services, and other technology trends. End-users and devices now demand access to applications and services anytime, anywhere, which leaves no room for latency. Consequently, companies across the spectrum are establishing edge data centers as a cost-effective, high-functionality way to deliver content and performance to customers.

A good way to learn more about edge data centers is to understand their usage. The following services primarily rely on edge computing:

Internet of Things

IoT tools require low latency and reliable connections to the data center to function at high intensity. IoT devices account for a vast number of edge computing deployments, and edge computing makes running them simple and effective.

Streaming Content

Streaming content is one of the most consumed forms of infotainment. Users today want their video to start with a single click, which edge facilities help achieve.
Drones

While drones are increasingly popular, their features are also advancing rapidly. With edge computing, for example, drones can be controlled even from far-flung locations without a hitch.

Artificial Intelligence

AI is one of the most thriving technologies, prized for its scalability. To make AI advantageous to a system, it must be able to access data, process it, and communicate with end-users effectively and quickly, which an edge data center allows.

Virtual Reality

Virtual reality needs updates as quickly as possible to create an immersive world for users. Though primarily associated with gaming, VR has also gained recognition in other domains such as communication and education.

Edge Computing and Data Centers: The Future

A Dedicated 5G Provider

Edge computing is underway, building mammoth telecommunications capabilities into data center growth trends. These facilities could change the dynamics of 5G for enterprise brands and emerge as dedicated 5G providers for organizations.

Support for Sustainable Business Goals

Edge data centers are seen as a periphery that can help build more efficient solutions for the sector's sustainability. Because edge computing is specifically designed to keep applications and data closer to devices and their users, there is little doubt about the impact it will have on sustainable business goals.

Making Way for Robot Security Guards

Evolution in AI and IoT has drastically changed human staffing needs inside data centers and made way for robots. Robots have already been deployed in some hyperscale data centers for specific tasks; whether it is automated inspection, locating faulty discs, or disc charging, with robots at the helm such work can be completed seamlessly.
"Many data center and robotics professionals are predicting that the next couple of years will bring big leaps when it comes to placing more robotics in the data center environment," Bill Kleyman, now Switch EVP of digital solutions, wrote in 2013.

How Does One Choose a Location for a Data Center?

Data centers are a critical part of any enterprise's operations, so decisions about their location cannot be left to arbitrary choice. In the past, companies set up their edge data centers close to their offices to maintain proximity. That is swiftly changing now that equipment administration and monitoring can be handled remotely. As the data center industry transforms, performance is no longer the sole consideration: companies now evaluate sites on economic, political, social, and geographical factors, along with energy efficiency, business continuity planning, and resource optimization. With so much at stake, edge data centers should be effortlessly accessible.

Conclusion

Edge computing and data center growth have garnered a lot of interest over the past few years, and they will continue to thrive for many years to come as they meet global tech demands and the current and future needs of users worldwide.

Frequently Asked Questions

What are the benefits of edge computing?

Among the top benefits of edge computing are its quick response time and low latency across devices. It also reduces bandwidth usage and creates less risk in corporate security.

What are the drawbacks of edge computing?

A significant drawback of edge computing is its need for large storage capacity. The security challenge is also relatively high due to the massive amount of data stored at the edge.
Moreover, the high cost is also a disadvantage.

Read More


Related News

Network Security

Ampliphae, HPE Athonet and Arqit deliver Quantum-Safe Private 5G using Symmetric Key Agreement

PR Newswire | January 19, 2024

Arqit Quantum Inc., a leader in quantum-safe encryption, and Ampliphae Ltd (Ampliphae), a leader in network cyber security solutions, have today announced the successful completion of a project that will deliver enhanced quantum-safe security for Private 5G networks. The Security Enhanced Virtualised Networking for 5G (SEViN-5G) project, funded by Innovate UK, the UK Government’s innovation agency, leveraged Ampliphae’s network security analytics technology and Arqit’s Symmetric Key Agreement Platform to deliver a quantum-secure Private 5G testbed that can protect against both current and future cyber threats. Athonet, a Hewlett Packard Enterprise acquisition, provided the Radio Access Network (RAN) equipment for the project, with a cloud core hosted on AWS. Private enterprise networks based on 5G cellular technology are accelerating digital transformation across industries including manufacturing, healthcare, defence and smart cities. Private 5G gives enterprises access to high-speed, massively scalable, and ultra-reliable wireless connectivity, allowing them to implement innovative IoT and mobile solutions that enhance productivity, drive automation and improve customer engagement. The security of these networks will be paramount as they will support safety-critical infrastructure and carry highly sensitive data. But like any new technology, 5G comes with potential new threats and security risks, including the threat from quantum computing. The project finished in December 2023 and customer engagement has already begun. David Williams, Arqit Founder, Chairman and CEO said: “Enterprises want to deploy Private 5G networks with complete confidence that they will be safe from both current and future cyber threats including from quantum computers.
Working alongside Ampliphae, we have shown that a quantum-safe Private 5G network is deliverable using Arqit’s unique encryption technology.” Trevor Graham, Ampliphae CEO said: “Private 5G can be hosted partly or completely in the Cloud, giving enterprises the opportunity to rapidly set up their own cellular networks customised to support their operations. With Ampliphae and Arqit they can now be certain that those Private 5G networks are monitored and secure against eavesdropping and disruption.” Nanda Menon, Senior Advisor Hewlett Packard Enterprise said: “In an era where security is paramount, the completion of the SEViN-5G project is a significant milestone. The delivery of a quantum-secure Private 5G testbed, achieved where Athonet have combined the Athonet core with CableFree radios, underscores the commitment to innovation and reinforces the confidence enterprises can have in deploying networks that are both cutting-edge and secure from both present and future threats.” About Arqit Arqit Quantum Inc. (Nasdaq: ARQQ, ARQQW) (Arqit) supplies a unique encryption Platform as a Service which makes the communications links of any networked device, cloud machine or data at rest secure against both current and future forms of attack on encryption – even from a quantum computer. Compliant with NSA standards, Arqit’s Symmetric Key Agreement Platform delivers a lightweight software agent that allows devices to create encryption keys locally in partnership with any number of other devices. The keys are computationally secure and operate over zero trust networks. It can create limitless volumes of keys with any group size and refresh rate and can regulate the secure entrance and exit of a device in a group. The agent is lightweight and will thus run on the smallest of end point devices. The Product sits within a growing portfolio of granted patents. 
It also works in a standards compliant manner which does not oblige customers to make a disruptive rip and replace of their technology. Recognised for groundbreaking innovation at the Institution of Engineering and Technology awards in 2023, Arqit has also won the Innovation in Cyber Award at the National Cyber Awards and Cyber Security Software Company of the Year Award at the Cyber Security Awards. Arqit is ISO 27001 Standard certified. www.arqit.uk About Ampliphae Ampliphae’s distributed network analytics technology provides insight into how networks are used to support enterprise operations at every level. A graduate of the prestigious LORCA cyber accelerator in London, and the AWS European Defence Accelerator, Ampliphae’s technology is already used by enterprises across multiple verticals to discover, analyse and secure the network traffic that supports their key applications and business processes. Ampliphae’s Encryption Intelligence product operates at enterprise scale to discover devices and applications that use cryptography, analysing their encryption capabilities to detect risks, including assets that are vulnerable to future quantum computer attack. Using Encryption Intelligence, the organisation can gather effective operational intelligence about their encryption landscape, both within and outside the organisation, and build an effective mitigation program to address current and future vulnerabilities.

Read More

Network Security

Cato Networks Introduces World's First SASE-based XDR

PR Newswire | January 25, 2024

Cato Networks, the leader in SASE, announced the expansion of the Cato SASE Cloud platform into threat detection and incident response with Cato XDR, the world's first SASE-based, extended detection and response (XDR) solution. Available immediately, Cato XDR utilizes the functional and operational capabilities of the Cato SASE Cloud to overcome the protracted deployment times, limited data quality, and inadequate investigation and response experience too often associated with legacy XDR solutions. Cato also introduced Cato EPP, the first SASE-managed endpoint protection platform (EPP/EDR). Together, Cato XDR and Cato EPP mark the first expansion beyond the original SASE scope pioneered by Cato in 2016 and defined by industry analysts in 2019. SASE's security capabilities encompassed threat prevention and data protection in a common, easy-to-manage, and easy-to-adopt global platform. With today's announcement, Cato is expanding SASE into threat detection, incident response, and endpoint protection without compromising on the architectural elegance captured by the original SASE definition. "Cato SASE continues to be the antidote to security complexity," said Shlomo Kramer, CEO and co-founder of Cato Networks. "Today, we extend our one-of-a-kind SASE platform beyond threat prevention and into threat detection and response. Only Cato and our simple, automated, and elegant platform can streamline security this way." An early adopter of Cato XDR is Redner's Markets, an employee-owned supermarket chain headquartered in Reading, Pennsylvania, with 75 locations. Redner's Markets' vice president of IT and Infrastructure, Nick Hidalgo, said, "The Cato platform gave us better visibility, saved time on incident response, resolved application issues, and improved network performance ten-fold." (Read more about Redner's Markets and Cato in this blog.)
"The convergence of XDR and EPP into SASE is not just another product; it's a game-changer for the industry," said Art Nichols, CTO of Windstream Enterprise, a Cato partner. "The innovative integration of these capabilities brings together advanced threat detection, response capabilities, and endpoint security within a unified, cloud-native architecture—revolutionizing the way enterprises protect their networks and data against increasingly sophisticated cyber threats." (Read more about what Cato partners are saying about today's news in this blog.)

Platform vs. Product: The Difference Matters

Cato XDR takes full advantage of the enormous benefits of the Cato SASE Cloud platform, the first platform built from the ground up to enable enterprises to connect, secure, and manage sites, users, and cloud resources anywhere in the world. Unlike disjointed point solutions and security appliances, Cato capabilities are instantly on, always available at scale, and fully converged, giving IT teams a single, shared context worldwide to understand their networks, prevent threats, and resolve problems. As an autonomous platform, Cato SASE Cloud sustains its evolution, resiliency, optimal performance, and security posture, saving enterprises the operational overhead of maintaining enterprise infrastructure. Enterprises simply subscribe to Cato to meet their business needs. Cato's cloud-native model revolutionized security and networking operations when it was introduced in 2016, a fact validated three years later in 2019 when the Cato approach was formally recognized by the industry as SASE.

Breach Times Still Too Long; Limitations of Legacy XDR

Cato is again revolutionizing cybersecurity with the first SASE platform to expand into threat detection, empowering security teams to become smarter and remediate incidents faster. The flood of security alerts triggered by network sensors, such as firewalls and IPS, complicates threat identification.
In 2023, enterprises required 204 days on average to identify breaches.1 XDR tools help security analysts close this gap by ingesting, correlating, and contextualizing threat intelligence information with the data from native and third-party sensors. However, legacy XDR tools suffer from numerous problems relating to data quality. Sensor deployment extends the time-to-value as IT must not only install the sensors but also develop a baseline of specific organizational activity for accurate assessments. Data quality is also compromised when importing and normalizing third-party sensor data, complicating threat identification and incident response. Security analysts waste time sorting through incident stories to identify the ones most critical for immediate remediation. Once determined, incident remediation is often hampered by missing information, requiring analysts to master and switch between disparate tools. No wonder in 2023, average breach containment required more than two months.1

Cato XDR and Cato EPP Expand the Meaning of SASE

Cato XDR addresses legacy XDR's limitations. Instantly activated globally, Cato XDR provides enterprises with immediate insights into threats on their networks. Incident detection is accurate due to Cato's many native sensors – NGFW, advanced threat prevention (IPS, NGAM, and DNS Security), SWG, CASB, DLP, ZTNA, RBI, and now EPP/EDR. Powered by Bitdefender's world-leading malware prevention technology, Cato EPP protects endpoints from attack – in the Cato way. Endpoint threat and user data are stored in the same converged Cato data lake as the rest of the customer's network data, simplifying cross-domain event correlation. The result is incredibly high-quality data that improves the incident identification and remediation process. Cato AI uses the data to accurately identify and rank incidents, empowering analysts to focus critical resources on an organization's most important remediation cases.
Cato AI is battle-tested and proven across years of threat hunting and remediation handling by Cato MDR service agents. Remediation times are reduced because detected incident stories contain the relevant information for in-depth investigation. Cato's tools sit in the same console as the native engines, enabling security analysts to view everything in one place: the current security policy and the reviewed story. Finally, incident reporting is simplified with generative AI. Purpose-built for investigations, this natural language engine provides human-readable explanations of incident stories. Analysts save time sharing incident information with other teams and reporting to their managers.
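The data-lake correlation idea described above (endpoint and network events stored together, then grouped and ranked into "incident stories") can be sketched in a few lines. This is an illustrative toy under invented assumptions, not Cato's actual correlation or AI ranking logic; every field name, sensor label, and severity score here is hypothetical:

```python
from collections import defaultdict

def build_incident_stories(events):
    """Group security events from multiple sensor types into per-host
    'incident stories', then rank stories by a naive combined severity.
    Illustrative sketch only; real XDR correlation is far richer."""
    stories = defaultdict(list)
    for ev in events:
        # Correlation key: the affected host. A real system would also
        # correlate on users, sessions, time windows, and indicators.
        stories[ev["host"]].append(ev)
    # Rank so analysts see the most severe combined story first.
    return sorted(
        stories.items(),
        key=lambda kv: sum(e["severity"] for e in kv[1]),
        reverse=True,
    )

# Hypothetical events from endpoint (EPP) and network (IPS, DNS) sensors
# sharing one data lake, so no normalization step is needed.
events = [
    {"host": "pc-17", "sensor": "EPP", "severity": 8, "detail": "malware blocked"},
    {"host": "pc-17", "sensor": "IPS", "severity": 6, "detail": "C2 beacon"},
    {"host": "pc-02", "sensor": "DNS", "severity": 2, "detail": "newly seen domain"},
]
ranked = build_incident_stories(events)
assert ranked[0][0] == "pc-17"  # highest combined severity surfaces first
```

The point of the sketch is the architectural claim in the release: when all sensors write to one converged store with shared context, cross-domain grouping is a simple join rather than an import-and-normalize pipeline.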

Read More

Network Infrastructure

DISH Wireless Awarded $50 Million NTIA Grant for 5G Open RAN Integration and Deployment Center

PR Newswire | January 16, 2024

DISH Wireless, a subsidiary of EchoStar, was awarded a historic $50 million grant from the U.S. Department of Commerce's National Telecommunications and Information Administration (NTIA) to establish the Open RAN Center for Integration & Deployment (ORCID). ORCID will allow participants to test and validate their hardware and software solutions (RU, DU and CU) against a complete commercial-grade Open RAN network deployed by DISH. "The Open RAN Center for Integration and Deployment (ORCID) will serve a critical role in strengthening the global Open RAN ecosystem and building the next generation of wireless networks," said Charlie Ergen, co-founder and chairman, EchoStar. "By leveraging DISH's experience deploying the world's first standalone Open RAN 5G network, ORCID will be uniquely positioned to test and evaluate Open RAN interoperability, performance and security from domestic and international vendors. We appreciate NTIA's recognition of DISH and ORCID's role in driving Open RAN innovation and the Administration's ongoing commitment to U.S. leadership in wireless connectivity." To date, this grant represents NTIA's largest award under the Public Wireless Supply Chain Innovation Fund (Innovation Fund). ORCID will be housed in DISH's secure Cheyenne, Wyoming campus and will be supported by consortium partners Fujitsu, Mavenir and VMware by Broadcom and technology partners Analog Devices, ARM, Cisco, Dell Technologies, Intel, JMA Wireless, NVIDIA, Qualcomm and Samsung. NTIA Administrator Alan Davidson and Innovation Fund Director Amanda Toman will join EchoStar Co-Founder and Chairman Charlie Ergen, EchoStar CEO Hamid Akhavan, EVP and Chief Network Officer Marc Rouanne and other stakeholders to announce the grant and tour a DISH 5G Open RAN cell site later today in Las Vegas. 
During this event, DISH will outline ORCID's unique advantages, including that it will leverage DISH's experience as the only operator in the United States to commercially deploy a standalone Open RAN 5G network. DISH and its industry partners have validated Open RAN technology at scale across the country; today DISH's network covers over 246 million Americans nationwide. At ORCID, participants will be able to test and evaluate individual or multiple network elements to ensure Open RAN interoperability, performance and security, and contribute to the development, deployment and adoption of open and interoperable standards-based radio access networks. ORCID's "living laboratory" will drive the Open RAN ecosystem — from lab testing to commercial deployment. Below are highlights of ORCID:

- ORCID will combine both lab and field testing and evaluation activities.
- ORCID will be able to test elements brought by any qualified vendor against DISH's live, complete and commercial-grade Open RAN stack.
- ORCID will use DISH's spectrum holdings, a combination of low-, mid- and high-band frequencies, enabling field testing and evaluation.
- ORCID will evaluate Open RAN elements through mixing and matching with those of other vendors, rather than validating a single vendor's stack.
- DISH's experience in a multi-vendor environment will give ORCID unique insights about the integration of Open RAN into brownfield networks.
- ORCID's multi-tenant lab and field testing will occur in DISH's secure Cheyenne, Wyoming facility, which is already compliant with stringent security protocols in light of its satellite functions.

About DISH Wireless

DISH Wireless, a subsidiary of EchoStar Corporation (NASDAQ: SATS), is changing the way the world communicates with the Boost Wireless Network. In 2020, the company became a nationwide U.S. wireless carrier through the acquisition of Boost Mobile.
The company continues to innovate in wireless, building the nation's first virtualized, Open RAN 5G broadband network, and its portfolio includes the Boost Infinite, Boost Mobile and Gen Mobile wireless brands.
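The mix-and-match evaluation model described above, testing each Open RAN element (RU, DU, CU) against other vendors' elements rather than a single vendor's full stack, amounts to enumerating the cross-vendor combinations an interoperability lab must cover. A trivial sketch with hypothetical vendor names (ORCID's real participant pools will differ):

```python
from itertools import product

# Hypothetical vendor pools per Open RAN element; invented for illustration.
radio_units = ["VendorA-RU", "VendorB-RU"]
distributed_units = ["VendorC-DU", "VendorD-DU"]
centralized_units = ["VendorE-CU"]

# Every RU/DU/CU combination is a distinct multi-vendor stack to validate
# for interoperability, performance, and security.
combos = list(product(radio_units, distributed_units, centralized_units))
assert len(combos) == len(radio_units) * len(distributed_units) * len(centralized_units)
```

Even this toy shows why single-vendor validation understates the problem: the test matrix grows multiplicatively with each additional vendor per element, which is the gap a shared integration center is meant to close.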

Read More

Network Security

Ampliphae, HPE Athonet and Arqit deliver Quantum-Safe Private 5G using Symmetric Key Agreement

PR Newswire | January 19, 2024

Arqit Quantum Inc, a leader in quantum-safe encryption, and Ampliphae Ltd (Ampliphae), a leader in network cyber security solutions, have today announced successful completion of a project that will deliver enhanced quantum-safe security for Private 5G networks. The Security Enhanced Virtualised Networking for 5G (SEViN-5G) project, funded by Innovate UK, the UK Government’s innovation agency, leveraged Ampliphae’s network security analytics technology and Arqit’s Symmetric Key Agreement Platform to deliver a quantum-secure Private 5G testbed that can protect against both current and future cyber threats. Athonet, a Hewlett Packard Enterprise acquisition, provided the Radio Access Network (RAN) equipment for the project with a cloud core hosted on AWS. Private enterprise networks based on 5G cellular technology are accelerating digital transformation across industries including manufacturing, healthcare, defence and smart cities. Private 5G gives enterprises access to high-speed, massively scalable, and ultra-reliable wireless connectivity, allowing them to implement innovative IoT and mobile solutions that enhance productivity, drive automation and improve customer engagement. The security of these networks will be paramount as they will support safety-critical infrastructure and carry highly sensitive data. But like any new technology, 5G comes with potential new threats and security risks including the threat from quantum computing. The project finished in December 2023 and customer engagement has already begun. David Williams, Arqit Founder, Chairman and CEO said: “Enterprises want to deploy Private 5G networks with complete confidence that they will be safe from both current and future cyber threats including from quantum computers. 
Working alongside Ampliphae, we have shown that a quantum-safe Private 5G network is deliverable using Arqit’s unique encryption technology.” Trevor Graham, Ampliphae CEO, said: “Private 5G can be hosted partly or completely in the Cloud, giving enterprises the opportunity to rapidly set up their own cellular networks customised to support their operations. With Ampliphae and Arqit they can now be certain that those Private 5G networks are monitored and secure against eavesdropping and disruption.” Nanda Menon, Senior Advisor, Hewlett Packard Enterprise, said: “In an era where security is paramount, the completion of the SEViN-5G project is a significant milestone. The delivery of a quantum-secure Private 5G testbed, achieved where Athonet have combined the Athonet core with CableFree radios, underscores the commitment to innovation and reinforces the confidence enterprises can have in deploying networks that are both cutting-edge and secure from both present and future threats.”

About Arqit

Arqit Quantum Inc. (Nasdaq: ARQQ, ARQQW) (Arqit) supplies a unique encryption Platform as a Service which makes the communications links of any networked device, cloud machine or data at rest secure against both current and future forms of attack on encryption – even from a quantum computer. Compliant with NSA standards, Arqit’s Symmetric Key Agreement Platform delivers a lightweight software agent that allows devices to create encryption keys locally in partnership with any number of other devices. The keys are computationally secure and operate over zero trust networks. It can create limitless volumes of keys with any group size and refresh rate and can regulate the secure entrance and exit of a device in a group. The agent is lightweight and will thus run on the smallest of endpoint devices. The Product sits within a growing portfolio of granted patents.
It also works in a standards compliant manner which does not oblige customers to make a disruptive rip and replace of their technology. Recognised for groundbreaking innovation at the Institution of Engineering and Technology awards in 2023, Arqit has also won the Innovation in Cyber Award at the National Cyber Awards and Cyber Security Software Company of the Year Award at the Cyber Security Awards. Arqit is ISO 27001 Standard certified. www.arqit.uk

About Ampliphae

Ampliphae’s distributed network analytics technology provides insight into how networks are used to support enterprise operations at every level. A graduate of the prestigious LORCA cyber accelerator in London, and the AWS European Defence Accelerator, Ampliphae’s technology is already used by enterprises across multiple verticals to discover, analyse and secure the network traffic that supports their key applications and business processes. Ampliphae’s Encryption Intelligence product operates at enterprise scale to discover devices and applications that use cryptography, analysing their encryption capabilities to detect risks, including assets that are vulnerable to future quantum computer attack. Using Encryption Intelligence, the organisation can gather effective operational intelligence about their encryption landscape, both within and outside the organisation, and build an effective mitigation program to address current and future vulnerabilities.
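The symmetric key agreement idea described above, where devices create identical keys locally for a group rather than transmitting key material, can be illustrated with a minimal toy. This sketch derives a group key from a shared root secret with a keyed hash and rotates it by epoch; it is a conceptual stand-in only, since Arqit's actual protocol is proprietary, and the function name and inputs are invented:

```python
import hashlib
import hmac

def derive_group_key(group_secret: bytes, group_id: str, epoch: int) -> bytes:
    """Derive a 256-bit symmetric key for a (group, epoch) pair.
    Conceptual toy: any device holding the group secret computes the
    same key locally, so no key crosses the network."""
    info = f"{group_id}|epoch:{epoch}".encode()
    return hmac.new(group_secret, info, hashlib.sha256).digest()

# Two devices in the same group and epoch derive identical keys.
secret = b"pre-shared-group-root-secret"  # hypothetical provisioning step
k_device_a = derive_group_key(secret, "factory-floor", epoch=1)
k_device_b = derive_group_key(secret, "factory-floor", epoch=1)
assert k_device_a == k_device_b

# Advancing the epoch refreshes the key for the whole group at once.
k_next = derive_group_key(secret, "factory-floor", epoch=2)
assert k_next != k_device_a
```

The sketch captures the two properties the release emphasises, local key creation and configurable refresh, but omits everything that makes a real platform secure: authenticated provisioning of the root secret, membership changes, and forward secrecy.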

Read More



Events