The techUK Data Centres Council is working with DCMS to ensure UK data centre employees are included in a list of critical infrastructure workers who would be exempt from any lockdown measures limiting free movement in London.
If data centre employees are not included in the final list of critical infrastructure workers -- due for publication in a matter of days -- or at least designated as key workers, data centres without remote operating capabilities could be forced to house and feed business-critical staff onsite.
The post UK data centre operators ready to ‘camp out’ in event of London lockdown appeared first on Techerati.
Google revealed that a set of wheels supporting one of its data centre racks buckled, precipitating a chain of events that caused some CPUs to overheat and disrupted Search, Gmail and other services for some users.
The unusual episode was discovered after a site reliability engineer on the company's traffic and load-balancing team was alerted that Google services served from its edge network were producing an abnormally high number of errors.
The post How a data centre flat tyre disrupted Google services appeared first on Techerati.
Data centre infrastructure specialist Vertiv has announced a partnership with Uptime Institute to deliver TIER-Ready prefabricated modular data centres.
The partnership will see Vertiv sell Uptime Institute Tier Certification of Constructed Facilities (TCCF) services with Vertiv TIER-Ready Modular Units, allowing the units to quickly earn TCCF certification once deployed.
The post Vertiv teams up with Uptime Institute on TIER-ready modular data centres appeared first on Techerati.
The US Department of Energy's (DoE) upcoming El Capitan supercomputer will be capable of 2 exaflops of computing performance, making it more powerful than the top 200 fastest supercomputers combined.
The record-breaking supercomputer, which is expected to be delivered in early 2023 and will be located at the Lawrence Livermore National Laboratory (LLNL) in California, will be used by the DoE's National Nuclear Security Administration to advance America's nuclear security missions.
The post El Capitan supercomputer will be 10 times as powerful as anything we’ve seen before appeared first on Techerati.
Google is stepping up efforts to attract international customers to its cloud platform, announcing four new cloud regions on top of the four it was already planning to launch this year.
The Alphabet subsidiary, which trails Microsoft and Amazon in the public cloud market, revealed plans to open new cloud regions in Delhi, Doha, Melbourne and Toronto, all cities located in countries with existing data centre regions.
Operational technology multinational Honeywell has claimed it has cracked a quantum computing conundrum that will pave the way for the "world's most powerful quantum computer".
Honeywell added that it expected to release the record-breaking system within the next three months.
Quantum computers leverage qubits instead of bits to tackle problems that would take ordinary computers millions or even billions of years to solve. They are widely expected to accelerate applications such as drug development, weather forecasting and materials design.
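The bit-versus-qubit distinction above can be illustrated with a minimal sketch. This is an assumption-laden toy model (not Honeywell's technology): it represents a single qubit as a pair of amplitudes whose squared magnitudes give the measurement probabilities, whereas a classical bit is simply 0 or 1.

```python
import math
import random

# Toy model: a qubit's state is a pair of amplitudes (a, b) with
# |a|^2 + |b|^2 = 1. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2; a classical bit has no such blend.

def measure_zero_fraction(amplitudes, trials=10000):
    """Repeatedly 'measure' the qubit and return the fraction of 0 outcomes."""
    a, _b = amplitudes
    p0 = abs(a) ** 2
    return sum(1 for _ in range(trials) if random.random() < p0) / trials

# An equal superposition: measurement gives 0 or 1 roughly half the time each.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))
print(measure_zero_fraction(plus))  # close to 0.5

# A definite state behaves like a classical bit: always 0.
zero = (1.0, 0.0)
print(measure_zero_fraction(zero))  # 1.0
```

The power of real quantum hardware comes from entangling many such qubits, which this single-qubit sketch deliberately does not attempt to model.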
The post Honeywell teases “world’s most powerful quantum computer” appeared first on Techerati.
As the electric and digital worlds converge, talent acquisition and team management are among the biggest challenges faced by data centre and technology businesses today.
According to the Harvey Nash/KPMG CIO Survey, "The single highest cause of stress is being short of staff". The survey found that the UK's tech industry is experiencing its most severe skills shortage in more than a decade, with almost two thirds of CIOs (64%) reporting a shortfall of talent.
The post Open to all, time for the data centre sector to end the skills issue appeared first on Techerati.
Underpinning today’s data revolution are data architectures which define how data is stored, arranged, managed and used.
With the rise of data-intensive applications such as AI and analytics and the deluge of unstructured data brought about by IoT, there is a growing need for more efficient and flexible data architectures so that organisations can keep data centre costs low and increase speed, agility and time-to-value for data initiatives.
The post Western Digital’s Manfred Berger on the disaggregation of the data centre appeared first on Techerati.
Those in the data centre industry today know that we are living in exciting times. Just five to ten years ago, we were using buzzwords such as the Internet of Things, machine learning, 5G, hyperscale, cloud computing and edge computing.
Those technologies are now very real and have become the catalyst for the data centre boom we are currently experiencing. The world has embraced technology in virtually every aspect of our lives, from teacherless classrooms using extended reality (XR) to autonomous driving, and from the fully automated and connected home to advances in medical technology.
Corporate IT infrastructure has never been so complex. There are a host of different options open to organisations, from running on public cloud and on-premise data centres, to SaaS cloud native capabilities and serverless infrastructure. Applications are running across an ever-wider range of technologies and geographies, making it increasingly difficult to monitor and troubleshoot when something goes wrong. Unsurprisingly, companies often struggle with a messy array of different tools and technologies based in these different locations.
Application visibility is one particularly notable pain point. An estimated 80 percent of enterprises have gaps in monitoring their cloud or are totally blind to it. In these scenarios, customer experience (CX) is often hit hard. If an organisation lacks real-time visibility over application performance, the gap between its internal view and users' actual perception of how the app is performing can widen unnoticed.
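One common way to close the gap described above is a synthetic probe that measures what a user actually experiences (end-to-end latency and success) rather than server-side metrics alone. The sketch below is a minimal illustration under stated assumptions: the fetch callable, the SLO threshold and the field names are all hypothetical, not any particular monitoring product's API.

```python
import time

# Minimal sketch of user-perspective monitoring. `fetch` stands in for one
# complete end-to-end request (DNS, network, app, rendering, ...) and
# returns True on success. The 0.5s SLO is an illustrative assumption.

def check_experience(fetch, slo_seconds=0.5):
    """Run one synthetic check and compare observed latency to the SLO."""
    start = time.monotonic()
    ok = bool(fetch())
    latency = time.monotonic() - start
    return {
        "ok": ok,                                  # did the request succeed?
        "latency_s": latency,                      # what the "user" waited
        "within_slo": ok and latency <= slo_seconds,
    }

# Example with a stub standing in for a real HTTP call:
result = check_experience(lambda: True)
print(result["within_slo"])
```

Running such probes from where users actually are, rather than inside the data centre, is what surfaces the internal-view-versus-user-perception gap the survey figures point to.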
Often, we struggle to discuss the edge - one of the IT world’s hottest trends - because it’s difficult to objectively define it. “The edge means something different to every person,” says Mark Howell, of the Ford Motor Company. Howell oversees the construction of every new IT facility the veteran automaker builds and is the lead for EMEA region design, planning and engineering.
From his perspective, the edge is effectively distributed technology, and need not encompass servers, storage and switches. By that token, Ford's first edge site wasn't a micro data centre, but the first remote offices and factories that Henry Ford built all those years ago. The company's Paris office opened in 1908; the Kansas City assembly plant followed in 1911. By the end of the 1920s, Ford had more than 20 overseas assembly plants.
The post The challenges of scaling the edge, with Mark Howell, Ford Motor Company appeared first on Techerati.
With data demands reaching new levels in 2020, the role of the data centre is set to take centre stage for IT leaders. Against a backdrop of constant disruption and increasingly ambitious enterprise and cloud strategies, how do you ensure that your data centre is futureproofed, so that you stay ahead of the game, rather than react to it?