Sunday 31 October 2021

StarlingX: Open-source Software at the Cutting Edge


Just as the evolution of the smartphone was enabled by the transition from 3G to 4G, use cases for connected devices will evolve with 5G. There will be connected appliances, connected cars, connected factories and even connected cities, each requiring the low latency and localized processing power of 5G to function. For communications service providers (CSPs), meeting the challenges of 5G isn’t simply a matter of building bigger networks. Instead, they will need to break their big networks down into smaller, disaggregated components that sit closer to these connected devices.

In this new network model, the edge becomes increasingly important. Unlike the core-centric network model of the past, 5G networks will require much of the heavy lifting of processing and analytics to be performed closer to the user, running on a very small footprint. In the case of smart factories, the edge could be multi-access edge computing (MEC) within a hosted private 5G network. For online gaming, this could mean adding compute capacity to the edge of the telecom network. In both cases, CSPs will need to deploy these edge resources quickly and cost-effectively if they hope to monetize 5G opportunities.

A purpose-built, cloud native platform for the edge

Container-based cloud platforms are the logical path forward for 5G networks. Containers provide a workable foundation for disaggregating network functions that is highly scalable and easier to manage. Software and hardware are separated using container network functions (CNFs) orchestrated as scalable building blocks via Kubernetes. Standards-based tools allow these components to be remotely configured and managed from a central point of control. Yet, even within cloud environments, edge applications have unique considerations for security, performance, durability and management. Adding to the complexity, an edge cloud still comprises many servers; they are simply distributed across a wide geographic area rather than sitting in a single data center.
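
To make the orchestration point concrete, here is a minimal sketch of rolling out a containerized network function as a Kubernetes Deployment from a central point of control, using the Kubernetes Python client. It assumes a reachable cluster and a local kubeconfig; the image name, namespace and resource requests are hypothetical placeholders, not part of any specific StarlingX or vendor workflow.

```python
# Minimal sketch: deploy a containerized network function (CNF) as a
# Kubernetes Deployment. Assumes a reachable cluster and kubeconfig;
# the namespace and image below are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a cluster

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="upf-cnf", namespace="edge-site-01"),
    spec=client.V1DeploymentSpec(
        replicas=2,  # scale the CNF as a building block
        selector=client.V1LabelSelector(match_labels={"app": "upf-cnf"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "upf-cnf"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="upf",
                    image="example.registry.local/upf-cnf:1.0",  # placeholder image
                    resources=client.V1ResourceRequirements(
                        requests={"cpu": "2", "memory": "4Gi"},
                    ),
                )
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(
    namespace="edge-site-01", body=deployment
)
```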

Because the edge has unique requirements, StarlingX was built for the edge first. StarlingX is an open-source, telco-grade cloud environment optimized for distributed edge applications, and it is gaining industry momentum. It is based on Kubernetes, OpenStack and other open-source cloud technologies and is designed specifically to meet the low-latency and small-footprint requirements at the edge. In fact, StarlingX is the foundational cloud software used in the real-time platform built into the Infrastructure (INF) project of the O-RAN Alliance, which enables software developers to develop, test and run O-RAN compliant RAN deployments.

Wind River Studio is a commercial implementation of StarlingX that includes a complete cloud platform, automation and orchestration tools and real-time analytics. Vodafone recently announced they have selected Wind River as a key supplier for their Open RAN rollout starting in the UK. With Wind River Studio, CSPs can safely and quickly deploy an edge cloud environment to support use cases such as MEC, IoT and vRAN on a proven and fully supported commercial platform. Earlier this month, Dell Technologies announced that it has jointly developed a reference architecture with Wind River for CSPs that provides a complete, end-to-end edge solution featuring Wind River Studio and Dell Technologies PowerEdge servers and PowerSwitch networking. This reference architecture delivers everything that CSPs need to quickly deploy, scale and manage resources anywhere in their 5G network.

Achieving lower costs and less complexity 

There are many benefits to deploying edge services with StarlingX, but let’s start with the big one: cost. In a recent report, Enterprise Strategy Group (ESG) found that moving to a disaggregated vRAN infrastructure running Wind River Studio’s commercial distribution of StarlingX saved CSPs up to 75% per node versus traditional appliance-based RAN systems. The savings were slightly lower (67%) for dual-redundancy nodes, but still impressive.

StarlingX reduces operational overhead by providing single-pane-of-glass management and zero-touch provisioning for the entire edge infrastructure. CSPs can manage, configure and automate upgrades and rollbacks of edge servers through standards-based Redfish APIs. This greatly accelerates the deployment of edge infrastructure, enabling CSPs to go from bare metal to fully functioning edge capabilities on day one. In the development of their reference architecture, Dell Technologies and Wind River validated that it could automate server provisioning and management tasks by leveraging Redfish APIs. These Redfish APIs are embedded in the integrated Dell Remote Access Controller (iDRAC) included with every Dell server. This provides operators with a commercially supported solution that automates the deployment and management of nodes out to the far edge of the operator’s network.
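
For illustration, the sketch below uses standard Redfish endpoints to inventory and reboot a managed server through its baseboard management controller. The management IP and credentials are placeholders, and the system member ID shown ("System.Embedded.1", typical on Dell iDRAC) should be verified against your own environment; this is not the reference architecture's own tooling.

```python
# Minimal sketch: query and reboot an edge server via its Redfish API.
# The IP address and credentials are placeholders; verify the system
# member ID against your environment.
import requests

BMC = "https://192.0.2.10"          # placeholder management IP
AUTH = ("redfish-user", "password")  # placeholder credentials

# Discover the managed system(s)
systems = requests.get(f"{BMC}/redfish/v1/Systems", auth=AUTH, verify=False).json()
for member in systems.get("Members", []):
    print("Found system:", member["@odata.id"])

# Read health and power state for one system
system = requests.get(
    f"{BMC}/redfish/v1/Systems/System.Embedded.1", auth=AUTH, verify=False
).json()
print(system.get("PowerState"), system.get("Status", {}).get("Health"))

# Request a graceful restart, for example so a newly staged image is picked up
requests.post(
    f"{BMC}/redfish/v1/Systems/System.Embedded.1/Actions/ComputerSystem.Reset",
    json={"ResetType": "GracefulRestart"},
    auth=AUTH,
    verify=False,
)
```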


StarlingX also generates real-time network analytics using machine learning tools, which can be used to optimize network configurations and performance, improve capacity planning and even create new services.

StarlingX is an open-source cloud platform built to meet the latency, cost, reliability and management requirements of the far edge. It has an active community around it and the contributors are continuously working on evolving the platform to fulfill the requirements of new edge use cases. Scaling from a single node to thousands of nodes, StarlingX is an advanced edge solution that is being adopted and supported by industry leaders like Vodafone, Wind River and Dell Technologies.

A flood of 5G applications is just around the corner. Give yourself the edge you need to capitalize on 5G revenue opportunities by taking a closer look at StarlingX. If you are already using StarlingX, the community would love to hear from you; please take the StarlingX user community survey.

Source: delltechnologies.com

Saturday 30 October 2021

Is 5G Already Transforming an Enterprise Near You?

Much has been said about the imperative for enterprises to automate operations and jump-start a new era of data-driven insight and near-real-time business decisions. Edge computing is a means to this end and an industry hot topic, as organizations of all kinds seek to act on data near the point of creation to generate immediate, essential value. That makes 5G a hot topic as well, since 5G enables exciting new use cases at the enterprise edge.

5G is much more than just the generation following 4G; it is a revolutionary communications upgrade that bridges IT and telecom infrastructures under a horizontal, cloud-based architecture that enables unprecedented economies of scale. If that were not enough, 5G can use more frequency bands and deliver lower latency and higher throughput to connect virtually everyone and everything, including people, machines, objects and devices. It promises to drive the development of game-changing applications at the edge, enabling much more widespread use of the Internet of Things (IoT), artificial intelligence and real-time data analytics for effective decision making.

However, as promising as 5G is, the 5G transition will be much like other waves of technological change, where new generations of technologies enable new use cases but do not fully replace the previous generations. In today’s edge computing environments, 5G is moving in alongside 4G and Wi-Fi private wireless networks in a blended approach to connectivity that is optimized for each use case.


Private Wi-Fi


Organizations use Wi-Fi in many environments—for example, to connect users within campuses, manufacturing plants or remote and branch offices and retail locations. Wi-Fi costs less than cellular systems, is straightforward to deploy and maintain, and offers a multitude of compatible devices and spectrum that does not require licensing. These characteristics make it ideal for small enterprises that require general connectivity and less critical communications services and do not have intense security and reliability requirements.

Most enterprises currently have a private Wi-Fi network strategy in place and will transition to 4G or 5G only where the inherent drawbacks of Wi-Fi networks are impacting business outcomes. Drawbacks include the fact that Wi-Fi requires more access points compared to 4G and 5G, which can increase reliability issues and management complexity as the system scales. Signals are also subject to interference and are less secure and reliable than with 4G and 5G. In addition, users cannot transition seamlessly to other networks when they leave the Wi-Fi signal area.

Because of these drawbacks, organizations will add 4G or 5G connectivity in addition to Wi-Fi for use cases that require more devices, more security and reliability and faster, seamless connectivity across networks.

Private 4G LTE


Introduced in 2009, 4G Long Term Evolution (LTE) delivers better speeds, security and reliability than Wi-Fi does. 4G is also interference-resistant and offers seamless mobility between on-premises access points and outside private or public networks. Its energy-efficient access points support many more simultaneously connected devices. However, 4G is more complex to deploy, operate and maintain than Wi-Fi and has limited spectrum available for enterprises.

Because 4G is a well-established standard with a vast and mature ecosystem of supported devices and applications, organizations will continue to use 4G for many enterprise use cases that are not suitable for Wi-Fi.

Private 4G networks can also be deployed on the Citizens Broadband Radio Service (CBRS) band, simplifying enterprise access to the 4G spectrum. However, as organizations expand their edge computing use cases and capabilities, connecting data from many thousands of devices and automating split-second responses, more use cases will transition to 5G.

Private 5G


For use cases that require high capacity, reliability, security and/or ultra-low latency, 5G is the standard of choice. It offers a 20X broader frequency spectrum and is highly interference resistant. As the most advanced standard, 5G currently has a maturing device ecosystem; however, most new implementations will use 5G going forward, which will rapidly diminish this disadvantage. Most industry communications players are strongly supporting the formation of a 5G device ecosystem.


With abundant spectrum being assigned for 5G use around the world, we will see private 5G shine as the default standard of choice for enterprises. The latest iteration of 5G, recently released by the 3rd Generation Partnership Project (3GPP), allows 5G operation in unlicensed bands. Various regulators around the world will permit 5G to use the 6GHz unlicensed band, unleashing a whopping amount of spectrum that is easily accessible by enterprises. Spectrum-demanding use cases, such as holographic communications, 3D quality inspection, augmented reality/virtual reality (AR/VR) and near real-time video processing, will be in the hands of enterprises, enabling their digital transformation.

Furthermore, 5G is particularly well suited for environments using enormous numbers of IoT devices. Its flexible resource assignment enables thousands of edge devices to interconnect with computing systems to return instantaneous insights.

Enterprises — make your plans to incorporate 5G


5G is essential to the enterprise digital transformation journey. Enhancements in the recent 3GPP release 16, and the upcoming release 17, include extensions to existing features, as well as features that enable enterprises to address new use cases and deployment scenarios and even the potential for new verticals to form. Operation in the unlicensed spectrum, intelligent transportation systems, industrial IoT and non-terrestrial networks (satellites, unmanned aerial vehicles and high-altitude platforms) are just a few of the new capabilities addressed by new releases of 5G. And more new capabilities are on the 3GPP drawing board.

Many 5G-enabled use cases are emerging, and 5G implementations will live alongside existing and new generations of Wi-Fi and 4G for some time. Right now, adopting a flexible infrastructure that is ready to adapt to future use cases — many of which cannot be predicted today — is a solid approach that will prepare you for mainstream adoption of 5G and what comes next.

Source: delltechnologies.com

Thursday 28 October 2021

Close Encounters with the Third Premises


Aliens were a big deal when I was growing up, and they featured prominently in pop culture. A lot of great sci-fi alien movies came out in the late 1970s and early ‘80s, including the rather sophisticated Close Encounters of the Third Kind, written and directed by Steven Spielberg. Most people will remember it from the distinctive five-note musical phrase, D’ E’ C’ C G.

The movie takes its title from the Hynek scale, UFO expert J. Allen Hynek’s classification of close encounters with extraterrestrials, used to classify the degree of contact from vague sighting to hanging out with a living, breathing alien.

And, yes, I’m actually going to tie that into current information technology conversations.

What do I mean by third premises? The most persistent subject of cloud debate is on-premises versus off-premises. For most, on-premises refers to traditional data center environments and off-premises to public cloud data centers. Folks tend to twist themselves into knots trying to prove that on-prem versus off-prem is a zero-sum phenomenon. I believe that’s false, but it also misses out on a more interesting trend. What happens when all that IT equipment doesn’t find its way into either?

Increasingly, infrastructure and software are being deployed in the world around us — that’s the third premises. And these deployments defy the conventional false dichotomy (see my earlier blog, The Transformation Knothole, for more thoughts on this).

Nothing travels faster than the speed of light, not even great news

We’ve reached the peak of traditional centralized data and compute warehousing, and we’re seeing a fundamental shift in where computing happens and data is generated. Moving from thousands of systems in hundreds of locations to millions of systems and locations creates a scaling problem, especially for public cloud vendors. Glory awaits whoever can figure out this scaling issue – and we have a nice head start here at Dell Technologies.

Deployments are growing very fast, and I’m going to summon all of my sci-fi nerd power and tell you it’s all because of the speed of light (are you having flashbacks to high school science class yet?).

The speed of light governs the rate at which information can be transmitted. It’s wicked fast, but not fast enough in the world of modern computing. It causes latency, a measure of delay. Throw in some additional latency from equipment, and lag time starts to pile up quickly.

AWS publishes a handy chart that cites the latencies between its global data centers. Data centers in Northern Virginia and Central Ohio, where Amazon data center zones are closest, are 12 milliseconds (ms) apart under ideal conditions. That stretches to 72ms to traverse the country from Northern Virginia to Northern California. Both the 12ms and 72ms numbers reflect a highly engineered, owned and operated network; latencies will be much higher where network conditions are less ideal.
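
As a back-of-the-envelope check on those numbers, light in optical fiber travels at roughly two-thirds of its speed in a vacuum, so distance alone sets a hard floor on round-trip time. The sketch below computes that floor for rough great-circle distances (my estimates, not AWS figures); equipment and routing account for the rest of the published latencies.

```python
# Theoretical minimum round-trip time over fiber, ignoring equipment delay.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in a vacuum
FIBER_FACTOR = 0.67             # light travels ~33% slower in optical fiber

def min_round_trip_ms(distance_km: float) -> float:
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Rough straight-line distances (illustrative estimates)
print(f"N. Virginia -> Ohio (~500 km): {min_round_trip_ms(500):.1f} ms minimum")
print(f"N. Virginia -> N. California (~3,900 km): {min_round_trip_ms(3900):.1f} ms minimum")
```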

Have you recently had a close encounter (really close)?

Until somebody can harness something like quantum entanglement (back to high school physics, everyone) to create latency-free communications, we’ll have to live with it. But we are on the verge of a great swing of the pendulum from a state that Microsoft CEO Satya Nadella characterized as “peak centralization” toward a much more distributed IT environment. The operator of the second-largest public cloud, Microsoft realizes that centralization is good for business, but also that centralized public cloud services are not suitable for everything.


Businesses are rapidly digitizing their operations, coupling physical with digital spheres to drive improved outcomes of all kinds: safety and quality in manufacturing, security in public spaces, productivity and automation in warehouses, environmental efficiency in enclosed spaces and experiences in entertainment venues.

Delivering these improvements requires that the digital systems operate in real time, but moving data thousands of miles to be processed is the enemy of real-time execution (see latency above). We need to put the infrastructure where the data is being generated. That means the systems that bridge the digital and physical spheres need to be close, really close.

As proximity of systems becomes critical, deployments of IT equipment outside of data center environments (the third premises) will likely dwarf what we have seen over the past decade in terms of the scope and scale of public cloud.

You can’t fool us by agreeing with us

You’ve probably already had close encounters with third premises and just didn’t realize it.

At Dell Technologies, we have a long history of working with retailers to provide IT systems for their stores. Traditionally, a store might have a few servers and a few more point-of-sale systems. Retailers are now gearing up to deploy entire racks of IT gear to power in-store experiences, improve theft and loss prevention, adjust pricing via dynamic in-store operations, automate stocking, etc. Manufacturers are similarly deploying gear to automate hundreds of tasks and operations throughout factory floors.

I get asked a lot about whether the third premises (or edge) is real, likely spurred by people who have been burned by expectations around consumer IoT. It is very real, and it is not a sci-fi future state; it is already everywhere around you.

Preparing for a decentralized world will be critical to long-term business continuity and human progress. Digital transformation will continue to move out of traditional data centers to the cloud and the edge. Having the capacity to ingest, manage and connect data and experiences will be crucial.

In Close Encounters of the Third Kind, Richard Dreyfuss’ character was asked what he wanted. “I just want to know that it’s really happening.” The third premises are here. As the iconic little boy in the movie said, “You can come and play now.”

Source: delltechnologies.com

Tuesday 26 October 2021

Rethinking Security for the Edge


Edge computing is driven and defined by a massive proliferation of devices generating, capturing and consuming data outside the traditional data center. Capitalizing on this data is critical for maintaining competitiveness in the data decade, but securing an exponentially expanding attack surface presents even greater challenges for IT and security staff.

We have seen attacks at the edge increase over the last few years in locations such as retail stores, gas stations and utilities. These attacks are often either cyberattacks that originate from a distance, or physical “man in the middle” attacks in which a physical device intercepts data from legitimate devices, such as gas pumps or registers. This shows not only the increasing need for intrinsic security at the edge, but also the new ways we will need to prepare for and defend against vulnerabilities.

Detecting a cyberattack in the data center can be like finding a needle in a haystack. And a compute environment that spans multiple near and far edge locations and many thousands of devices creates a lot of haystacks to hide needles in. Using a layered approach to protect infrastructure and data at the edge can lower risks and enable you to deploy edge use cases with more confidence.

When creating a strategy for protecting data at the edge, it’s important to remember that security considerations typically fall into three categories: the physical layer, the operational layer and the application layer.


Physical layer


Data centers are built for physical security, with a set of policies and protocols designed to prevent unauthorized access and avoid physical damage to, or loss of, IT infrastructure and data stored on these systems. At the edge, servers and other IT infrastructure are likely to be housed in a utility cabinet, under a desk or at a remote location with no regular IT staff on-site. That makes securing physical equipment even more critical at the edge.

Points that you need to consider for physical security at the edge include:

◉ Controlling infrastructure and devices throughout the end-to-end lifecycle, from the supply chain and factory to operation to disposition.
◉ Preventing systems from being altered or accessed without permission.
◉ Protecting vulnerable access points, such as open ports, from bad actors.
◉ Preventing data loss if a device or system is stolen or tampered with.

Operational layer


Beyond physical security, IT infrastructure is subject to another set of vulnerabilities once it is in operation at the edge. In the data center, infrastructure is deployed and operated under a set of tightly controlled circumstances, and data is protected behind the corporate firewall. However, edge environments tend to lack dedicated IT staff, and servers and devices are often deployed by nontechnical personnel. The vast number of devices being deployed, and lack of visibility to these devices, makes securing IT in operation more challenging than in a centralized environment.

Points to consider for securing IT in operation at the edge include:

◉ Ensuring a secure boot spin up for infrastructure with an uncompromised image.
◉ Controlling access to the system, such as locking down ports to avoid physical access.
◉ Installing applications into a known secure environment.
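
As a concrete illustration of the first point above, one common operational-layer control is to verify that a boot or application image matches a known-good digest before it is installed on an edge node. The sketch below shows the idea; the file path and expected digest are placeholders for whatever your build pipeline publishes.

```python
# Minimal sketch: verify an image against a known-good SHA-256 digest
# before installation. Path and digest are placeholders.
import hashlib
import sys

EXPECTED_SHA256 = "0" * 64          # placeholder: digest from your build pipeline
IMAGE_PATH = "/tmp/edge-image.img"  # placeholder: image staged for installation

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

actual = sha256_of(IMAGE_PATH)
if actual != EXPECTED_SHA256:
    sys.exit(f"Image digest mismatch ({actual}); refusing to install.")
print("Image verified; proceeding with installation.")
```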

Application layer


Once you get to the application layer, data protection looks a lot like traditional data center security. However, the high volume of data transfer combined with the large number of endpoints inherent in edge computing opens up points of attack as data travels between the far and near edge, the cloud and the main data center and back.

Considerations for application security at the edge include:

◉ Securing external connection points.
◉ Identifying and locking down exposures related to backup and replication.
◉ Allowing application traffic from known resources only.
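
As a simple illustration of the last point above, admitting application traffic only from known resources can be reduced to an allowlist check against approved source networks. The networks and the check below are hypothetical; in practice the policy would come from your firewall or configuration management system.

```python
# Minimal sketch: accept traffic only from known source networks.
# The allowlist is a hypothetical example.
import ipaddress

ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.20.0.0/16"),   # hypothetical near-edge site
    ipaddress.ip_network("192.0.2.0/24"),   # hypothetical central data center
]

def is_allowed(source_ip: str) -> bool:
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

for ip in ("10.20.14.7", "203.0.113.99"):
    print(ip, "allowed" if is_allowed(ip) else "rejected")
```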

Bring intrinsic security to the edge


When it comes to security at the edge, compromise is unthinkable, and cybercriminals are everywhere. Edge deployments require special focus to provide the same level of robust security as the traditional data center. Dell Technologies takes a holistic approach that builds in security with automation to streamline operations across your core to cloud to edge environments.

Source: delltechnologies.com

Sunday 24 October 2021

Looking back at Twenty Years of OEM Solutions


This year we’re celebrating the 20th anniversary of Dell Technologies OEM Solutions division. This milestone is particularly special for me as I also celebrate one year leading this innovative team.

Way back in 2000 – when we had to print out directions for a road trip, and never would have imagined receiving a package just hours after ordering – the OEM Solutions division formed to meet customer demand for hardware that didn’t yet exist. These customers came to Dell because they needed equipment that was very close to products we were already selling but specialized to meet the requirements of projects they had in the works – like industrial-grade solutions able to withstand harsh factory environments or equipment that would be installed in remote, offshore locations. That’s how our work began – adding some extra battery life here, ruggedizing a server there, creating unique and custom-made solutions for every customer.

Today, our OEM customers are everywhere. In health care, they’re pioneering genomic research and developing electronic health records systems. In telco, they’re building out 5G networks. In transportation, they’re automating ports and developing new ways to track shipments. We’re driving innovation at the edge, and our ruggedized servers are the backbone of systems from remote MRI machines to systems and equipment monitoring across industries.

From those first ad-hoc customer requests 20 years ago, we’ve grown our OEM business into a multi-billion-dollar global powerhouse that serves more than 40 vertical industry specialties. Our team of more than 700 professionals customizes, designs, industrializes, transforms and innovates on behalf of our customers so they can focus on delivering the best solutions to end-users.

As we reflect on what we have achieved, here are a few key milestones for the Dell Technologies OEM Solutions group:

◉ 2000: The team developed and launched OEM Solutions shortly after the birth of the business, including OEM Ready, which gives customers the ability to rebrand Dell Technologies’ solutions or opt for a brandless experience. The team launched program management and support services in 2003, which included dedicated teams of engineers and OEM specialists to serve customers’ unique needs.

◉ 2010: Dell Technologies OEM Solutions established a channel partner program, which expanded to Europe and Asia in 2012 and 2013. Today, the Dell Technologies Partner Program empowers and rewards partners for taking their customers’ fully customized solutions to market. Our momentum is strong, growing OEM orders revenue through the channel by 45% year-over-year in the first half of this fiscal year.

◉ 2014-2017: Dell Technologies OEM Solutions introduced enterprise-class edge technology including the first OEM PowerEdge server platform specifically designed to thrive in space-constrained, rugged and harsh environments. This edge portfolio recognized the need for tailored solutions that collect and process data at the edge, serving as a foundation for future OEM product development.

◉ 2018: Dell Technologies OEM Solutions delivered our first 1U rack workstation, built to anticipate the needs of customers who required powerful performance in the tightest spots at the edge.

◉ 2021: Our innovation engine is stronger than ever, and it’s showing in the depth and breadth of our customers. In the first half of this fiscal year, we increased new customer accounts by 32% over the same period last fiscal year. Whether we are providing specialized infrastructure, software or services solutions, we continue to meet the current and future needs of our customers in this “do-anything-from-anywhere-world.”

As we look forward to the next 20 years, we know innovation and global execution will continue to be key. Our future is unfolding at the edge and our OEM customers and partners are powering an ever-changing marketplace. I cannot wait to see the impact we will make together next.

Source: delltechnologies.com

Saturday 23 October 2021

Can Private Wireless Networks Shape Tomorrow’s Enterprise?


While the 2G and 3G mobile standards were designed to enable and enhance mobile users’ broadband, their inherent technical limitations and the absence of a consolidated enterprise ecosystem limited the adoption of private wireless networks (PWNs) by enterprises.

High latency, low throughput and restricted flexibility to scale prevented enterprises from using mobile communications networks as an integral connectivity component of their operational machinery.

During the last 10 years, with the deployment of 4G networks around the world, enterprises found a viable connectivity solution for their demanding requirements for the first time. The enterprise segment understood how powerful it would be to introduce high-performance connectivity as a complement to their information technology (IT) and operational technology (OT) infrastructures.

Fast forward and the entire world is talking about 5G

5G has been designed since its inception to respond to stringent requirements from the enterprise segment. A cloud-native architecture, low latency and high throughput are some of the attributes that have refreshed enterprises’ interest in private wireless networks. The exponentially growing amount of data generated by enterprises and end users can be wirelessly and flexibly collected at the edge of the network and transformed into intelligent business insights thanks to 5G connectivity.

What are private wireless networks?

Private wireless networks are dedicated local on-premises networks designed to cover a specific location, site or premises (e.g., port, factory, warehouse, mine, shopping mall, industrial or educational campus) using a pre-allocated spectrum.

PWNs can have dedicated radio, core, and management functions that can either run on the enterprise’s dedicated infrastructure or be leased from a mobile network operator (MNO) or third party.

Private wireless and edge cloud – better together

Private 5G and edge cloud are considered the main catalysts for the fourth industrial revolution. The private wireless radio loop will connect sensors, machines and processes at speed, while the edge cloud will transform that data into actionable intelligence.

Although it is possible to deploy private wireless networks independently, or even without edge cloud, enterprises gain exponential benefits by combining private wireless with the ability to compute at the edge. The shorter loop from the sensor to the computing elements where data is transformed into insight enables a plethora of innovative use cases that are impossible to deliver with a centralized public cloud architecture, given the inherent higher latency associated with that topology.

Private wireless spectrum

Spectrum is a scarce resource that requires meticulous allocation criteria, and telecom regulators have recognized enterprises’ need to access spectrum resources directly.

The introduction of 5G enables the use of additional frequency bands, both licensed and unlicensed. Spectrum assignment procedures are evolving towards less deterministic methodologies, allowing dynamic allocation based on per-use needs. In the future, we will see spectrum being centrally managed and dynamically allocated following a set of predefined hierarchical access rules. This will enable a better spectrum utilization factor and open the way for enterprises to build and manage their own networks.

The private wireless dilemma

According to Analysys Mason, more than three-fourths of all enterprises intend to deploy private wireless networks before 2024. Enterprises feel the pressure and understand the need to embrace technology to compete and survive. But deploying, operating and maintaining PWNs are tasks that most enterprises are not immediately ready to handle. They also want to find operational synergies with their existing IT operational model and be able to determine the end-to-end cost-benefit relationship. Answering these questions will require enterprises to break down organizational silos, streamline operational processes and elaborate a consistent partnership strategy.

Who do you partner with?

Ultimately, PWN adoption will be driven by the need for enterprises to improve processes, productivity and quality to compete in a new era of data-driven business decisions. Thanks to edge computing and private wireless networks, a potent human-augmentation infrastructure is forming at the edge of the network, allowing a plethora of new possibilities.

At Dell Technologies we strive to be the partner of choice for CSPs and enterprises to accelerate business transformation.

Our open, disaggregated telecom solutions portfolio enables the formation of a rich partnership ecosystem with de facto industry leaders and zero lock-in. Our innovative offer schemes are tailored for business visibility and reduced complexity.

From core, to edge, to RAN, to any cloud, our secure global supply chain can meet your technology demands at any scale.

Source: delltechnologies.com

Thursday 21 October 2021

Where and What is the Edge?


Remember Big Data? Even the name seemed bigger than life, because it was usually capitalized. When people first started talking about it, we had the “three Vs of Big Data”: volume, velocity and variety. Then came the “five Vs,” adding veracity and value to the prior three. Then the “seven Vs,” with variability and visualization added. While writing this, I found an “eighth V”: viability. Today, the term “Big Data” is rarely used, but its legacy lives on, because much of what we learned from Big Data analytics can now be applied to help business leaders solve problems, gather new insights and gain a competitive advantage using data from their edge.

While Big Data has become simply “data” and is part of the fabric of every enterprise, edge computing is enabling low-latency analysis of high-volume, high-velocity, high-value data ingested and processed near its source. A big plus is that organizations can maintain governance and compliance with data locality and privacy requirements.

Defining your edge

At the intersection of edge computing and business needs is an incredible opportunity for enterprises to act on data near the point of creation to create immediate, essential value. But in the age of IoT, LTE and 5G, where — or what — is your edge?

Most definitions of edge computing agree that it refers to moving compute closer to data sources, although the specifics can vary wildly. But even that definition is a bit ambiguous, as your edge could be just down the hall from the core data center, in a hospital room or flying around the world on a jumbo jet.

The bottom line is that the definition of the edge is not as important as what use case you are trying to enable to accomplish a business outcome. No matter what you call it or where you put it, what determines the right approach to edge computing is specific to your industry and even to your organization.

For example, healthcare organizations may wish to build remote clinics to serve customers in rural areas. This requires the ability to process data close to the source to provide clinicians with the near real-time information they need to provide excellent patient care. Edge computing for healthcare could include analytics systems capable of processing data from patient monitoring systems or solutions that process, consolidate and transfer insights used for population health tracking. Because the solutions will be deployed at remote locations with limited IT staff, they need to be compact and simple to deploy and operate, with intrinsic security, advanced automation and remote management capabilities.

As another example, an airline may want to deploy sensors that can detect impending equipment issues and proactively notify maintenance so parts can be replaced without delaying the next flight. This predictive maintenance capability requires a streaming data solution that can pull data from IoT sensors, analyze it and provide maintenance recommendations based on anomalies in metrics such as stress, vibration and heat.

Bringing in external information, such as data about weather and other environmental conditions, increases accuracy and predictive capabilities. Because large amounts of data need to be created, processed, stored and transferred from planes to edge compute systems to a centralized data center and back, solutions need to offer streaming data analytics capabilities, powerful compute, ample, cost-effective storage and fast data transfer speeds.
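
To make the streaming analytics idea more tangible, here is a minimal sketch of one common approach: flagging anomalies in a sensor metric, such as vibration, using a rolling mean and standard deviation. The readings are simulated for illustration; a real deployment would ingest them from aircraft IoT sensors through a streaming platform and apply far richer models.

```python
# Minimal sketch: rolling z-score anomaly detection on a simulated
# vibration stream. Window size and threshold are illustrative.
from collections import deque
from statistics import mean, stdev

WINDOW = 50        # number of recent readings to baseline against
THRESHOLD = 3.0    # flag readings more than 3 standard deviations out

window = deque(maxlen=WINDOW)

def check_reading(value: float) -> bool:
    """Return True if the reading looks anomalous relative to recent history."""
    anomalous = False
    if len(window) == WINDOW:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) / sigma > THRESHOLD:
            anomalous = True
    window.append(value)
    return anomalous

# Simulated stream: steady vibration with one spike at the end
readings = [1.0 + 0.01 * (i % 5) for i in range(100)] + [4.0]
for i, r in enumerate(readings):
    if check_reading(r):
        print(f"Reading {i}: {r} flagged; recommend inspection before next flight.")
```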

While all of these examples describe edge compute solutions, they are so different in design and outcome that lumping them under the common term “edge” seems like a misnomer. The IT infrastructure footprint, network, input sources, security, data storage and protection, and architectural considerations will all be driven by several things. These include the characteristics of the data that must be analyzed at the edge, how fast insights must be derived to align to the desired business outcome and what portion of that data or metadata will need to be sent back to a centralized environment.

One edge need may involve continuous streaming data from sensors to identify anomalies such as fraud, security risks or quality degradation. Another may include ad hoc analysis of infrequent data inputs. Because nearly every technology vendor claims to have an offering for the edge, it can be difficult to weed out providers that offer a one-size-fits-all approach to edge computing.

To avoid trying to fit a square peg into a round hole, look for a partner that can help you identify and implement a solution with outcome-based benefits versus a technology vendor just trying to sell you a product. In addition, make sure your edge technology partner can provide a comprehensive selection of the core components of edge solutions, including compute, storage, networking, cloud, applications, automation, orchestration, security, analytics, support, services and software-defined management to suit your use cases.

Source: delltechnologies.com

Tuesday 19 October 2021

Delivering AI at the Edge


The edge is not a new place, but it is garnering lots of attention, especially when it comes to Artificial Intelligence (AI). In fact, AI is the number one workload for the edge, according to Moor Insights & Strategy in the newly published paper, “Delivering the AI-Enabled Edge with Dell Technologies.” The paper also points out that numerous organizations across all industries are extending the reach of their IT infrastructures to the edge, with many of them being directed from the top down.

Edge Dynamics and Requirements

As we move further into a world of autonomous operations, whether it be with self-driving vehicles or automated manufacturing lines, the combination of AI and the edge, along with a robust and scalable enterprise IT infrastructure, is required to bring full automation to fruition. However, the pairing of AI and edge adds new levels of complexity, with many edge devices and endpoints being located in less secure environments, where devices will need to be hardened against weather and theft.

Deploying edge devices to remote locations should not be an afterthought. An edge solution ideally will be fully integrated with an organization’s overall IT infrastructure and thus be as enterprise-grade as the rest of the data center. Although the requirements of AI at the edge can be exceedingly complex, for the most part they mirror the capabilities of traditional IT infrastructure.

◉ Environmental – Edge solutions will need to be adaptable to many diverse environments.

◉ Connectivity – Edge applications need to be ready to be disconnected from traditional networks.

◉ Manageability – Edge solutions deployed at scale will require new intelligent management functions.

◉ Reliability – Edge devices need to be able to function in all types of potentially challenging scenarios, making them reliable in all locations.

◉ Security – Edge solutions located outside the confines of traditional IT walls require intrinsic and comprehensive security.

Of course, organizations can deploy IT infrastructure capable of encompassing edge locations, but the most successful ones realize the value of strong technology partnerships in working with complex, emerging technologies such as AI at the edge.

The AI-enabled Edge

The proliferation of artificial intelligence continues apace across the spectrum and now the edge appears to be the last frontier for the development of a more autonomous environment. At Dell Technologies, we work with customers around the world on their AI initiatives. We are a leader in AI, driving innovations with it in our products, building the infrastructure to run AI workloads and deploying AI across our operations to streamline processes. Our deep expertise and our broad portfolio allow us to provide significant value to organizations in their digital transformations.

Dell Technologies offers a full range of integrated solutions and compute, storage and networking offerings ready for AI at the edge, including:

◉ Integrated solutions – Validated Designs for AI that include the compute, networking, storage, software, and services optimized for AI workloads.

◉ Compute – From laptops and workstations to servers and HCI solutions.

◉ Storage – Data at the edge requires enterprise-grade, scalable storage and management.

◉ Networking – A full slate of network solutions and switches to integrate edge devices into existing enterprise networks.

Our portfolio of integrated solutions for the core to the edge to the cloud is built with industry leading compute and storage. These solutions are fully supplemented with manageability, security, flexible deployment and consumption-based models, all backed by world class service and support.

Source: delltechnologies.com

Sunday 17 October 2021

Simplifying AI at the University of Pisa


In today’s environments, IT organizations need to support growing numbers of artificial intelligence applications, along with many other new and emerging high performance computing workloads. This mission creates challenges that aren’t necessarily solved with conventional IT technologies and approaches. In a typical scenario, IT administrators set up separate systems to accommodate AI workloads, and then manage many aspects of those systems with time-intensive manual processes.

Today, there is a better way forward — thanks to the integration of technologies from VMware, NVIDIA and Dell Technologies. With NVIDIA and VMware collaborating on software, IT organizations can run traditional and AI workloads in the same environment and on the same systems. This makes life easier for the people tasked with delivering the information technology to support emerging workloads. AI applications can now be managed with the same VMware flexibility as with other applications.

In addition, with the tight integration of VMware and NVIDIA offerings, organizations can now virtualize multiple technologies inside their systems. For example, they can virtualize and share the GPUs inside servers to enable multiple data scientists to simultaneously accelerate their deep learning workloads. This increases utilization rates and saves money that would have otherwise been spent on the procurement and management of additional hardware.

This integration of technologies also helps IT organizations save time and administrative steps. And perhaps best of all, they can manage everything centrally with VMware vCenter and easily allocate resources as needed.

These are the kinds of benefits that the University of Pisa is realizing today with its use of Dell Technologies with VMware and the NVIDIA AI Enterprise software suite.

Solution highlights

◉ Dell EMC VxRail provides a simple, cost effective hyperconverged infrastructure that solves a wide range of IT challenges and supports almost any use case, including tier-one applications and mixed workloads. VxRail enables faster, better and simpler delivery of VMware-virtualized applications. For a seamless, curated and optimized HCI experience, VxRail is engineered jointly by Dell Technologies and VMware. VxRail is also an NVIDIA-Certified System, which means it has been validated to provide excellent performance, security, and scalability for AI and data science workloads.

◉ Dell EMC PowerScale storage is designed to serve as the foundation for data, building an integrated and optimized IT Infrastructure for AI initiatives, from proof of concept (POC) to production. These all-flash scale-out network-attached storage solutions deliver the analytics performance and extreme concurrency at scale to consistently feed data-hungry deep learning algorithms. And combined with PowerScale OneFS governance and enterprise features for data management, data security, data compliance and data protection, PowerScale storage helps IT organizations conform to regulatory and enterprise security policy requirements.

◉ NVIDIA AI Enterprise is a software suite of enterprise-grade AI tools and frameworks that is optimized, certified and supported by NVIDIA with the latest VMware vSphere. With this software, IT professionals at the hundreds of thousands of enterprises that use vSphere for compute virtualization can now support AI with the same tools they use to manage large-scale data centers and hybrid cloud environments. NVIDIA AI Enterprise provides scale-out, multi-node AI application performance on vSphere that is indistinguishable from bare-metal servers.

◉ VMware vSphere is the industry’s leading server virtualization software for applications using any combination of virtual machines, containers and Kubernetes. Rearchitected with native Kubernetes, vSphere lets you modernize the 70+ million workloads running on it. And with vSphere with Tanzu, you can now run modern, containerized applications alongside existing enterprise applications on existing infrastructure.

A Center of Excellence

The University of Pisa is both a Dell Technologies and a VMware AI Center of Excellence. As part of this designation, the University’s IT team regularly evaluates and tests new technologies. That’s the case today with the combination of Dell Technologies with VMware and NVIDIA AI Enterprise.

“We are running AI workloads on top of VMware, and we are using Dell EMC PowerStore for the storage in this virtualized environment,” explains Maurizio Davini, chief technology officer for the University of Pisa. “And we have a Dell EMC PowerScale all-flash environment for AI and HPC as a sort of traditional scale-out in our fast systems.”

Davini notes that his organization has both bare-metal and virtualized systems with NVIDIA GPUs and DPUs.

“We have traditional bare-metal GPUs, which are typically used for research like language processing, image processing, deep learning, deep neural network research and so on,” he says. “So we are still increasing our bare-metal capabilities on GPUs. And we now also have clusters of GPUs inside our VMware production environment which match the performance level of our bare metal systems.”

Flexibility is the key here.

“VMware gives us the possibility to be flexible and to use the infrastructure for a lot of things — enterprise workloads, VDI, remote workstations, support for smart working, scientific computing, HPC — all in the same infrastructure in a very flexible way,” he says. “And this is the problem that VMware and Dell have helped us to solve.”

For the full story, see the Dell Technologies case study “Simplifying AI systems.”

Source: delltechnologies.com

Saturday 16 October 2021

Unify IT, OT, and Business for Smart Manufacturing Outcomes


Industrial progress has been at the center of human advancement for centuries. Three industrial revolutions brought about major societal transformations:

◉ Capturing the power of water and steam to mechanize manufacturing.

◉ Using electricity to create mass-produced consumer goods.

◉ Leveraging electronic systems to enable robotics and automation.

Today, we are at the threshold of the fourth industrial revolution, referred to by many as Industry 4.0. It is being accelerated by the emergence of sophisticated and powerful edge technologies that enable you to act on data near the point of creation to create immediate, essential value.

In an era when data is revolutionizing manufacturing, the ability to collect, analyze and act upon time-critical data is a game-changer. Today’s smart manufacturing plants use powerful technologies like industrial Internet of Things (IIoT), artificial intelligence (AI), machine learning (ML) and streaming data analytics to get real-time insights that enhance business agility with better, faster decisions. These can help you increase overall equipment effectiveness (OEE), perform predictive maintenance, enhance production quality and optimize product yields.

However, the fast ramp to smart manufacturing is fraught with challenges. It requires connecting and managing disparate devices and equipment, merging data streams from multiple sources, building real-time insights and scaling efforts across sites. You need to mitigate risks to security and reliability, and in many cases, it all must happen in remote, harsh and/or space-constrained environments.

Simplify your path to smart manufacturing with a validated, end-to-end edge solution

The Dell Technologies Validated Design for Manufacturing Edge with Litmus is a validated, end-to-end solution designed to centrally manage and orchestrate IIoT and other edge devices, data and applications from the factory floor to the enterprise cloud. The solution is built in partnership with Litmus to accelerate smart manufacturing outcomes.

The Litmus platform built into the solution helps you simplify deployment and integration from the edge to multi-cloud environments, with features such as out-of-the-box connectivity to hundreds of modern and legacy industrial assets, support for containerized applications and cross-factory duplication for faster time to value.

You can gain business agility with live insights, combining supercharged data persistence at the edge with pre-built and custom data visualizations and AI/ML-trained models to build a complete data picture from operational technology (OT) to IT, for better and faster decisions on the factory floor and in the boardroom.

This solution combines several proven technologies to win with edge computing in manufacturing. The solution is offered in three different configurations, sized based on workloads, each with the flexibility of a subscription-based or CapEx consumption model.


Dell Technologies Validated Design for Manufacturing Edge with Litmus


Foundationally, the Dell Technologies Validated Design for Manufacturing Edge with Litmus uses the award-winning Dell EMC VxRail hyperconverged infrastructure (HCI), which is purpose-built for edge computing in manufacturing to provide high availability, compute acceleration, AI/ML readiness, low‑latency storage and high-speed connectivity.

These solutions can thus scale up to any number of assets or sites with ease, resilience and security while centralizing the management and orchestration of your entire edge computing infrastructure at a global scale. For those who don’t need the high availability, Dell Technologies offers configuration options using Dell EMC PowerEdge servers.

The Dell Technologies Validated Design for Manufacturing Edge with Litmus comes integrated with the Dell EMC Streaming Data Platform (SDP) with advanced capabilities to ingest, process and store large streams of data at the edge. It bolsters the data connectivity in the solution with high-speed data persistence and unconstrained storage at the edge. This provides manufacturers the option to run ML training models at the edge using limitless playback of historical data.

With SDP, edge data is leveraged at the point of creation and consumption, so it does not need to go to the cloud. This helps improve the latency and security of real-time operational data for edge computing in manufacturing. As the data storage in SDP reaches its limit, you can simply add more storage, with no maximum limit.


Your edge computing in the manufacturing environment can be further expanded with the latest Dell Technologies portfolio of edge-ready offerings, including Edge Gateway devices, Latitude Rugged Tablets and PCs and Dell EMC PowerEdge servers.

How will you use the Dell Technologies Validated Design for Manufacturing Edge with Litmus?


The Dell Technologies Validated Design for Manufacturing Edge with Litmus gives you the operational intelligence to transform silos of people, assets and processes into a data-driven, synchronized plan that gives OT unmatched visibility into actionable key performance indicators (KPIs). This lends itself well to supporting several edge computing use cases in manufacturing that are rooted in business outcomes. Let’s look at how this solution helps support these use cases.

Overall equipment effectiveness


OEE measures manufacturing productivity based on the metrics of quality, performance and availability.  This solution helps reduce unplanned downtime while maximizing asset utilization and product quality by using diagnostics and prescriptive analytics to enable intelligent asset optimization. It has a pre-built OEE KPI that makes it simple to gather the right data, visualize it and use it to make improvements.
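
For reference, OEE is conventionally computed as availability × performance × quality. The short worked example below uses illustrative shift figures of my own, not data from the solution, to show how the three factors combine.

```python
# Worked example of the standard OEE calculation with illustrative numbers.
planned_minutes = 480          # one 8-hour shift
downtime_minutes = 45          # unplanned stops
ideal_cycle_time = 0.5         # minutes per part at rated speed
total_parts = 800
good_parts = 770

run_minutes = planned_minutes - downtime_minutes
availability = run_minutes / planned_minutes
performance = (ideal_cycle_time * total_parts) / run_minutes
quality = good_parts / total_parts

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")
```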

Predictive maintenance


Predictive maintenance is the practice of using analytics to detect possible equipment and process defects so you can address them before they cause a failure. ML is key to enabling predictive maintenance. This solution collects data from assets, IIoT sensors and machine runtime, normalizes it, monitors it and then triggers a maintenance request when thresholds are reached.

The solution also applies statistical calculations (for example, anomaly detection and signal prediction) to identify and alert on behaviors that indicate impending failure. You can also send the data to the cloud or enterprise platform of your choice for deeper analytics and the application of ML. By using the built-in SDP, you can leverage high-speed data persistence and unconstrained data storage at the edge to train ML models without needing to send data to the public cloud. This is a critical capability for OT teams that do not want to move operational data between the factory floor and the public cloud due to data latency and security concerns. Using ML, this solution helps uncover previously unknown failure-mode patterns from large historical data sets. Trained models are then deployed at the edge to notify both staff and machines.
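
As a simplified illustration of the threshold-based trigger described above, the sketch below monitors normalized sensor metrics and raises a maintenance request when a limit is crossed. The thresholds and the create_maintenance_request() function are hypothetical stand-ins for whatever CMMS or ticketing integration a plant actually uses.

```python
# Minimal sketch: raise a maintenance request when a monitored metric
# crosses a threshold. Thresholds and integration are placeholders.
THRESHOLDS = {
    "bearing_temp_c": 85.0,
    "vibration_mm_s": 7.1,   # e.g., an ISO-style vibration severity limit
}

def create_maintenance_request(asset_id: str, metric: str, value: float) -> None:
    # Hypothetical integration point; replace with your ticketing/CMMS API.
    print(f"Maintenance request for {asset_id}: {metric}={value}")

def evaluate(asset_id: str, sample: dict) -> None:
    for metric, limit in THRESHOLDS.items():
        value = sample.get(metric)
        if value is not None and value >= limit:
            create_maintenance_request(asset_id, metric, value)

evaluate("press-07", {"bearing_temp_c": 91.2, "vibration_mm_s": 4.3})
```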

Yield optimization


Yield optimization refers to predicting production issues before they occur, minimizing downtime of equipment and processes, analyzing the optimal flow of a product throughout its assembly process and detecting anomalies before they result in production pauses. This solution empowers you with predictive analytics and intelligent asset optimization to anticipate interruptions and intervene in real-time. It leverages IIoT sensors, machines, systems and people to help maximize performance and quality while minimizing outages and losses.

Production quality


Production quality is the practice of evaluating the production process to ensure manufactured goods meet quality standards. This solution can help you speed and automate the inspection of work in process throughout the entire production cycle, allowing you to run your computer vision applications (as Docker® applications) to detect defects in products, parts, or packaging to improve safety, decrease liability and keep your customers coming back for more.

Enable smart manufacturing outcomes with Dell Technologies Validated Design for Manufacturing Edge with Litmus


Powered by new technologies for capturing and analyzing data at the edge, the fourth industrial revolution will undoubtedly transform the manufacturing industry with new insights and efficiencies, reducing downtime, improving product quality and increasing production capability. But like the three revolutions preceding it, the fourth will impact more than just the manufacturing plant. Industry 4.0 won’t just revolutionize the factory floor; it will revolutionize the world.

The Dell Technologies Validated Design for Manufacturing Edge with Litmus simplifies edge deployments with complete visibility for both OT and IT, using live insights powered by AI/ML. It fosters IT and OT collaboration on the manufacturing outcomes stipulated by the business, such as OEE, yield optimization, predictive maintenance, asset utilization and more.

Source: delltechnologies.com

Thursday 14 October 2021

Simplifying Your Edge to Thrive in the Data Decade

Dell EMC Study Materials, Dell EMC Career, Dell EMC Preparation, Dell EMC Exam, Dell EMC Prep, Dell EMC Guides, Dell EMC

Edge computing is not new.

Businesses across diverse industries have been collecting data and deploying IT outside of traditional data centers for many years. But how many of them are satisfied with the value they are deriving at the edge? And how many are truly ready for the data wave that is coming—driven by trends such as 5G, smart devices, internet of things (IoT) and high-speed connectivity?

Dell Technologies defines the ‘edge’ as where data is acted on near the point of creation to create immediate, essential value. Industry analysts predict that more than 50% of new IT infrastructure will be deployed at the edge by 2023. And the number of new operational processes deployed on edge infrastructure will grow from less than 20% today to over 90% by 2024.

Read More: DES-1B31: Dell EMC Elastic Cloud Storage Specialist Exam for Systems Administrator

The value that edge initiatives provide for organizations is driving real-time, actionable insights that achieve business outcomes. Ultimately, these benefits translate into long-term strategic and competitive advantages such as:

◉ Operational efficiencies

◉ Enhanced experiences

◉ Revenue generation

◉ Safety

◉ Sustainability

Most agree that edge computing needs to be an essential element of the data-driven enterprise. Yet for many organizations, the potential of data at the edge remains untapped, invisible and unused. The sheer volume of data, and the ability to make sense of it and arrive at actionable insights in real time, is a challenge in itself. Add the complexity that can arise from deploying, managing and supporting the proliferation of infrastructure, and you can see why organizations need to take a new approach to edge computing.

Simplify your edge

Dell Technologies is committed to helping you simplify your edge. We do this in three ways:

◉ Consolidating and streamlining data management and operations as you expand. Expansion often comes at the price of simplicity and efficiency. Years of legacy IT contribute to sprawl, silos and complexity. Delivering value at the edge will require the ability to consolidate and simplify IT and operational technology (OT), freeing you to scale capabilities exponentially.

◉ Securing the operational environment at the edge. Compromise is unthinkable anywhere, and cybercriminals are everywhere. Edge deployments need to be first-class citizens when it comes to security and compliance, with the same rigorous security features as infrastructure inside a traditional data center.

◉ Overcoming environmental and latency constraints. You need to ensure that the infrastructure deployed at the edge can withstand not only the physical environment but also the performance and latency requirements at the edge. This includes everything from the server and networking equipment in closets and on factory floors to the laptops, tablets and PCs in your teams’ hands.

Reliable, integrated, and secure edge solutions

We help customers take a strategic approach to the edge, assessing their environment and creating a plan so they can start generating business-driving insights faster with reliable, integrated and secure edge solutions. We are adding a variety of new products and solutions to our already extensive edge portfolio to help you capitalize on time-critical edge data.

IT built for the edge

Dell Technologies can help you get richer insights faster as you overcome environmental constraints at the edge, with devices and infrastructure designed to stand up to edge environments.


Improve efficiency and speed time to actionable insights using the intelligent Dell EMC Edge Gateway. Powered by modern Intel processors, these ruggedized devices give you uninterrupted performance in harsh industrial environments. Additionally, the gateway helps you connect your OT/IT infrastructure and collect and analyze edge-generated data, while providing intelligence, security and device management. Edge Gateways provide the long life expected for industrialized edge environments and offer customizable options that address unique OT customer needs through our OEM Solutions group.

Provide the ultimate in field productivity with Dell Latitude 5430 Rugged and 7330 Rugged Extreme laptops. These lighter, smaller and more powerful notebooks are designed for improved remote connectivity and are now 5G-enabled for faster communication on 5G networks.

Address evolving IT needs outside the data center with Dell EMC PowerEdge Tower servers. These powerful, adaptive servers act as compute platforms. They are engineered to optimize the latest technologies and scale as needed. Their autonomous compute capabilities help you respond rapidly to business opportunities while the servers’ proactive resilience embeds trust from edge to core to cloud with an infrastructure designed for secure interactions and the capability to predict potential threats.

Keep critical data flowing and delivering value


Dell Technologies can help you streamline data management and operations, with solutions that consolidate and simplify IT and OT, freeing you to scale capabilities exponentially.

Enable a smaller footprint and multiple high-performance GPUs for modern edge workloads with the Dell EMC VxRail hyperconverged infrastructure (HCI) edge appliance. This new VxRail satellite node deployment option is ideal for customers running VxRail in the core data center and desiring centralized automated management and lifecycle management across core and edge. Dell also supports customization and branding of the VxRail hardware platform to fit unique customer needs through our OEM Solutions group.

Bring real-time analytics to the edge with the Dell EMC Streaming Data Platform (SDP), software that acts as a data broker between edge devices and the cloud or core. The latest version provides expanded support for new hardware platforms such as VxRail and PowerFlex and is optimized for GPUs to deliver near real-time streaming video. These features, combined with unconstrained storage capability, make SDP a versatile platform for analytics from the edge to the multi-cloud.

Accelerate smart manufacturing outcomes with the Dell Technologies Validated Design for Manufacturing Edge with Litmus. This validated solution is designed to centrally manage and orchestrate industrial edge devices, data, and applications from the factory floor to the enterprise cloud.

Benefit from an end-to-end edge vision and the global scale to deliver it


Edge computing is critical for enterprises to be data-driven and competitive in the age of Industry 4.0. It cannot be achieved by piecemeal edge initiatives executed in silos. You need an end-to-end vision backed by a flexible technology that lets you start small and quickly ramp up at a global scale. With our latest launches, Dell Technologies is expanding its edge portfolio and continuing to bring the trusted experience needed to help you simplify your edge, from the edge to the multi-cloud.

Source: delltechnologies.com

Saturday 9 October 2021

Living on the Edge

Dell EMC Study Materials, Dell EMC Preparation, Dell EMC Career, Dell EMC Tutorial and Materials, Dell EMC Certification

It’s the middle of the night and I wake up feeling thirsty. Hearing a faint rhythm of a song I know from years ago, I walk towards the window holding a glass of water. I try to listen and voilà, it is indeed the familiar music of youth.

It makes me smile to think how the meaning of this phrase ‘living on the edge’ has travelled the distance of a few light years. I now work in the information technology industry, where edge is the guiding light into everything intelligent, responsive, real time and predictive.  

The edge of today, as in edge computing, is not pushing my blood pressure up or making me flush with adrenalin. It is rather helping industries improve efficiency, safety and productivity.  

Read More: E20-562: VPLEX Specialist Exam for Systems Administrator (DECS-SA)

I read “Business @ the Speed of Thought” by Bill Gates long ago. The practical implications of that title are being seen now. While IoT is already proving to be a critical enabler on the factory floor, organizations are now looking to further enhance the responsiveness of their manufacturing systems. Manufacturers envision a future where factory equipment can make autonomous decisions based on what’s happening in real time. A future where they can more easily integrate all steps of the manufacturing process, including design, manufacturing, supply chain and operations, facilitating greater flexibility and reactivity in competitive markets. Enabling this vision requires a combination of related technologies such as IoT, artificial intelligence, machine learning and, most importantly, edge computing.

A simpler definition of the edge is the end point where the action is happening. Edge computing means processing data as close to the source as possible, instead of following the traditional process of collecting data from sensors on a manufacturing process and sending that data to a PLC (Programmable Logic Controller).

Machines generate humongous amounts of data, which is traditionally fed over the network into the cloud for analysis, with instructions sent back to actuators to adjust the process. There is another way to make use of IoT devices: collect and analyze at the edge of the network and act on real-time data, without the bandwidth costs and the added expense of time that come with sending that data offsite for analysis. It is a distributed framework in which data is processed as close to the originating source as possible. This has many advantages over cloud computing, most notably reduced latency and infrastructure costs. Overall, by being at the scene of the action, the edge improves data reliability while saving the costs of duplication.
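As a toy illustration of that distributed model (my own sketch, not taken from any product), the snippet below aggregates raw sensor samples locally and forwards only a compact summary upstream instead of streaming every reading offsite; the sensor read and the send_upstream() transport are placeholders.

```python
# Toy illustration of processing at the edge: aggregate raw sensor samples locally and send
# only a compact summary upstream instead of streaming every reading offsite. The sensor
# read and the send_upstream() transport are placeholders.
import random
import statistics
import time

def read_sensor():
    """Stand-in for a real sensor read (e.g., a temperature in degrees C)."""
    return 20.0 + random.random()

def send_upstream(summary):
    """Placeholder: publish the summary to a PLC, message broker or cloud endpoint."""
    print("sending summary:", summary)

samples = []
for _ in range(600):          # e.g., one sample every 100 ms for a minute
    samples.append(read_sensor())
    time.sleep(0.1)           # shorten the interval when experimenting

send_upstream({
    "count": len(samples),
    "mean": round(statistics.mean(samples), 3),
    "max": round(max(samples), 3),
    "alarm": max(samples) > 20.9,   # hypothetical threshold acted on locally, in real time
})
```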

IoT involves collecting data from various sensors and devices and applying algorithms to the data to glean insights that deliver business benefits. Industries ranging from manufacturing and utility distribution to traffic management, retail, healthcare and even education are making use of the technology to improve customer satisfaction, reduce costs, improve security and operations and enrich the end-user experience.


Two drivers of edge technology today are high speed operations and processes that generate large amounts of data. High speed operations can create latency problems. Computing closer to the edge or directly on a machine reduces latency. Additionally, the device could operate independently, which may improve security by not having to access a large network.  

The best thing I have read about data processing was from Nick Fragale, co-founder of Rover Robotics, who said “run on the edge, train in the clouds.” I hear the same old familiar tune and wonder how the band performing the song in my head would react to the new meaning of what they once served up as the epitome of rebellion.

Source: delltechnologies.com

Thursday 7 October 2021

The Future of Software-defined Networking for Storage Connectivity

Dell EMC Study, Dell EMC Study Materials, Dell EMC Certification, Dell EMC Career, Dell EMC Preparation, Dell EMC Guides, Dell EMC Learning

A new business paradigm is driving the need for high-speed, software-defined storage connectivity, one in which modern multi-cloud applications use disaggregated infrastructure and data moves on demand to where it’s needed most: at the edge. NVMe over Fabrics is ideal for meeting the needs of these next-generation workloads, as well as those of existing applications and environments, by delivering improved performance, lower latency and greater efficiency.

Read More: DEA-1TT4: Dell EMC Information Storage Management (DECA-ISM)

As we’ve explored the various NVMe over Fabrics transport protocols, we’ve concluded that each is best suited for certain environments and use cases. For our initial release, we decided to focus on NVMe/TCP because it performs as well as NVMe/FC but scales to much higher speeds at significantly lower cost. Its biggest strength is its ability to leverage standard networking infrastructure and the ongoing investments being made in Ethernet, especially in support of cloud connectivity that is constantly pushing the need for more bandwidth and lower latency.

NVMe/TCP also does not require any specialized network configuration (for example, to support lossless behavior), nor is it susceptible to congestion spreading. The biggest downside? NVMe-oF has been difficult to manage at scale due to its reliance on a Direct Discovery (iSCSI-like) management model. To help address this limitation, we have been working relentlessly with industry partners like VMware to bring Centralized Discovery to NVMe/TCP, because it allows storage connectivity to be efficiently automated at scale. In addition, we not only want NVMe/TCP to be a drop-in alternative for FC, we want it to co-exist (run in parallel) with any other storage protocol, making the transition to NVMe-oF over TCP/IP simple for enterprises.
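To make the Direct Discovery model concrete, here is a minimal sketch of the manual, per-host workflow that Centralized Discovery is designed to automate, driving standard nvme-cli commands from Python; the discovery controller address is a placeholder, and the flow assumes the host already has NVMe/TCP support enabled.

```python
# Minimal sketch of the manual, per-host Direct Discovery workflow that Centralized
# Discovery is meant to automate: ask a discovery controller what subsystems exist, then
# connect to them. Uses standard nvme-cli commands (typically run as root); the discovery
# controller address below is a placeholder.
import subprocess

DISCOVERY_ADDR = "192.0.2.10"   # placeholder IP of the discovery controller
DISCOVERY_PORT = "8009"         # well-known NVMe/TCP discovery service port

def run(cmd):
    """Run an nvme-cli command and return its stdout (raises on failure)."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# 1. Query the discovery controller for the NVMe subsystems this host may reach.
print(run(["nvme", "discover", "-t", "tcp", "-a", DISCOVERY_ADDR, "-s", DISCOVERY_PORT]))

# 2. Connect to every subsystem the discovery controller returned.
run(["nvme", "connect-all", "-t", "tcp", "-a", DISCOVERY_ADDR, "-s", DISCOVERY_PORT])

# 3. Confirm the new namespaces are visible to the host.
print(run(["nvme", "list"]))
```

With a Centralized Discovery Controller such as SFSS in the picture, the goal is for hosts to register once and then be notified of, and connect to, new storage resources automatically, rather than an administrator repeating steps like these on every host.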

To create this user experience, Dell continues to invest in NVMe IP SAN product capabilities and enhancements across the Dell Technologies portfolio of storage, networking and compute, including a new Centralized Discovery Controller called SmartFabric Storage Software (SFSS) that provides the intelligence for this automated experience.


This new Dell NVMe IP SAN portfolio becomes available on November 18, 2021, and along with it we are announcing the following key innovations:

◉ SmartFabric Storage Software (SFSS) – Automates storage connectivity for your NVMe IP SAN. It allows host and storage interfaces to register with a Centralized Discovery Controller, enables storage administrators to create and activate zoning configurations and then automatically notifies hosts of new storage resources. Hosts will then automatically connect to these storage resources.

◉ PowerStore: NVMe/TCP protocol and SFSS integration – We have enabled our market-leading Dell EMC PowerStore storage array to use the NVMe/TCP protocol, and it will support both the Direct Discovery and Centralized Discovery management models. PowerStore integration with SFSS is initially accomplished via the Pull Registration technique.

◉ VMware ESXi 7.0u3: NVMe/TCP protocol and SFSS integration – We have partnered with VMware to add support for the NVMe/TCP protocol as well as the ability for each ESX server interface to explicitly register discovery information with SFSS via the Push Registration technique. We have also updated our OMNI plugin to support configuration of SFSS from within vCenter.

◉ PowerEdge: ESXi 7.0u3 using NVMe/TCP has been qualified.

◉ PowerSwitch and SmartFabric Services (SFS): While our NVMe IP SAN solution will run over traditional fabric switches, Dell EMC PowerSwitch with SmartFabric Services (SFS) can be used to automate the configuration of the switches that make up your NVMe IP SAN. In future releases, SFSS will integrate with SFS to create hardware-enforced zones.

◉ PowerMax and PowerFlex (future): You can expect us to add support for NVMe/TCP to both our Dell EMC PowerMax and PowerFlex over the next couple of releases.

SmartFabric Storage Software

SFSS can provide the same fabric services as FC on an IP-based network by combining several subservices to create the equivalent functionality.


Finally, Dell, along with several other interested parties, knows how important the ability to boot from SAN is, and as a result we have been driving technical proposals intended to add support for it as well.

Performance – NVMe/TCP vs. iSCSI

As mentioned previously, we decided to implement NVMe/TCP first because we felt it was the best “general purpose” IP-based storage protocol available. We also noticed that it could provide a tremendous performance boost compared to iSCSI on ESX. For example, initial internal testing has produced the following results:

◉ IOPS – NVMe/TCP provided 2.5-3.5x the IOPS of iSCSI

◉ Latency – NVMe/TCP reduced latency by 70-75%

◉ CPU per IO – NVMe/TCP reduced CPU utilization by 40-50%

Ecosystem engagement

In addition to the work Dell has done to create NVMe standards that will ensure the NVMe IP SAN will remain open and interoperable, we’re also heavily investing in the NVMe-oF ecosystem in several important ways:

◉ Open Source CDC Client – Dell is currently the maintainer of the open source client that will be used by Linux distributions to support discovery automation. The client package (nvme-stas) will be available near the end of the year.

◉ Developer Enablement – Dell will facilitate the creation of a GitHub-based developer community for sharing SFSS REST API documentation, a developer toolkit and useful scripts. There may even be a lite version of SFSS itself, so developers can quickly get up to speed and start automating connectivity.

Dell Technologies is continuing our long history of helping to solve customer challenges with innovative solutions that are optimized, cost-effective, and easy to implement. It’s an exciting time as organizations small and large continue their journey of IT Transformation to help stay ahead of new demands on their infrastructures and services. These NVMe IP SAN innovations from Dell Technologies in storage connectivity automation are revolutionary and represent what we believe is the future of storage networking.

Source: delltechnologies.com