Saturday 30 May 2020

The Evolution of the Data Warehouse in the Big Data Era

About 20 years ago, I started my journey into data warehousing and business analytics. Over all these years, it’s been interesting to see the evolution of big data and data warehousing, driven by the rise of artificial intelligence and widespread adoption of Hadoop.

When I started in this work, the main business challenge was how to handle the explosion of data with ever-growing data sets and, most importantly, how to gain business intelligence in as close to real time as possible. The effort to solve these challenges led to a ground-breaking architecture called Massively Parallel Processing (MPP), which shards data across multiple commodity servers with Direct Attached Storage (DAS), keeping data close to processing power. This was a move away from traditional symmetric multiprocessing (SMP), where data was stored in centralized storage and accessed over networks. This worked well – for a while!

Silos


The “explosion of data” invoked in every analytics/AI discussion is the driver for the evolution of data warehousing and business analytics. The success of the old architecture is also the reason it had to change: as data sizes grow, along with the number of concurrent applications and users accessing the data, you must reshuffle data quite often to keep it close to compute while queries are running.

This was often resolved by creating data silos, standing up a new infrastructure for new applications, as well as performance tuning. This created a multitude of business and operational issues. While you often deal with the same data sets, they are extracted, transformed, loaded (ETL) and maintained differently across various applications, which means you have different analytical windows into the data depending on which application you are using. So which data is the source of truth? From an operational point of view, the cost to maintain various copies of the data, secure it, and maintain SLAs becomes another challenge.

The figure below shows a typical analytics environment highlighting the complexity and data duplication.


Data required


The deluge of data mentioned above is driven almost entirely by IoT and edge devices, plus an explosion in personal devices and social media use. This is not your standard canned, transactional, nicely formatted SQL data. Instead, it's usually semi-structured or unstructured data that can sometimes only be accessed with a schema-on-read approach, which doesn't work for traditional SQL data warehouses. These new data types are also enabling, and perhaps even forcing, businesses to look beyond a traditional analytics approach. Today, Hadoop is a large part of many organizations' analytical capabilities, and AI is rapidly taking off, giving businesses 360-degree analytics capabilities, from descriptive all the way to prescriptive. Traditional architectures couldn't do this, at least not at scale.

Now what do we do?


IT infrastructure must evolve to meet these data demands. Modern analytics and business intelligence now touch every aspect of the business, with a much larger, more complex, multi-faceted data set and application portfolio. How do we manage all the data and applications? The answer is a “true” data lake: eliminate data silos, create a single accessible data source for all applications, and eliminate data movement and replication. Organizations can then tune compute-to-storage ratios for better cost, leverage best-in-class elastic cloud storage, and keep data in open formats, free of the lock-in associated with proprietary data management solutions.

At Dell Technologies, we're working closely with some of the leading software vendors in the data lake and data warehousing arena, such as Dremio, Yellowbrick, and Vertica, to enable our customers to seamlessly build and deploy scalable, future-proof IT infrastructure. Dell EMC Isilon and ECS provide a best-of-breed, market-leading architecture that lets IT departments use scale-out NAS and object storage technology with native support for multiple protocols such as NFS, HDFS, SMB and S3, all on the same hybrid file system, offering cost-effective performance and scaling plus a complete set of enterprise governance features such as replication, security, backup and recovery.

The proposed architecture enables IT to eliminate multiple landing zones and ETL zones and the need to replicate data. Simply load the data files into the data lake and point your application to query data directly from the data lake. It’s that simple!

As an example, the figure below shows an open data lake architecture incorporating a data lake engine from Dremio and data lake storage via Dell EMC Isilon and ECS. Dremio delivers lightning-fast queries directly on ECS, offering a self-service semantic layer enabling data engineers to apply security and business meaning, while also enabling analysts and data scientists to explore data and derive new virtual datasets. Dremio ensures flexibility and openness and lets you avoid vendor lock-in, directly query data across clouds or on-prem, optionally join data with existing external databases and data warehouses, and keep your data in storage that you own and control.
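As a rough sketch of what "pointing your application at the data lake" can look like in practice, the snippet below builds the JSON body for Dremio's REST SQL endpoint. The host name, dataset name, and query are invented for illustration; consult Dremio's API documentation for the exact endpoint and authentication details.

```python
import json

DREMIO_URL = "https://dremio.example.com"  # hypothetical Dremio host

def sql_request(query: str) -> dict:
    """Build the JSON body Dremio's REST SQL endpoint expects."""
    return {"sql": query}

# Query a Parquet dataset promoted from ECS object storage
# (the "ecs" source and file name below are illustrative).
payload = sql_request('SELECT region, SUM(sales) FROM ecs."sales.parquet" GROUP BY region')

# An application would POST this payload, e.g. with the requests library:
#   requests.post(f"{DREMIO_URL}/api/v3/sql", json=payload,
#                 headers={"Authorization": "<token>"})
print(json.dumps(payload))
```

Because the query runs directly against the object store, no ETL step or staging copy is needed before the data becomes queryable.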


Dell Technologies will continue partnering with key ISVs such as Dremio and Yellowbrick to deliver AI and data analytics solutions that allow our customers to power their journey to AI.

Thursday 28 May 2020

Delivering Innovation in the Data Era with ECS 3.5

The data decade is in full swing and it appears to be living up to the name.

Data growth continues to accelerate at a breakneck pace and the increased digitization of our daily lives only has a compounding effect on the rate of data creation. IDC’s projections bear this out: “within the next four years, the global economy will finally reach ‘digital supremacy,’ with more than half of GDP driven by products and services from digitally transformed enterprises.”

It has never been more critical that organizations support the applications, websites, cameras, home security systems, IoT sensors, and everything else we depend on in our daily lives. But in order to do so, companies must invest in storage technologies which empower them to capture all this data—the majority of which is unstructured in nature—and serve it up to their constituents at a moment’s notice.

Object storage systems are excellent candidates to do just that, and at Dell Technologies, we’ve got the best platform for the job. It’s called Dell EMC ECS.

We’ve made some innovative strides in the last few months to further enhance the value this object storage platform brings to organizations of all stripes. Our latest step on this journey of continuous innovation brings us to this release: ECS 3.5.


ECS 3.5 features compelling new capabilities such as:

◉ SSD for metadata read caching: Organizations looking to unlock more performance from their ECS appliances now have the option to include a 960GB flash drive per node. This improves system-wide read latency and IOPS, supporting analytics workloads which require faster reads of large data sets.

◉ Streamlined tech refresh functionality: We’ve made it even easier to migrate to the latest generation of ECS hardware via a non-disruptive upgrade process. By streamlining the evacuation of data from old nodes within a cluster, organizations can ensure continuity of operations while extending the life of their ECS investments.

◉ Customer replaceable drive upgrades (CRU): We’ve also enabled customers to take a DIY approach to replacing faulty drives. Now, with ECS 3.5, tech savvy IT teams don’t have to rely on an onsite Dell technician for break-fix, accelerating time-to-resolution.

◉ IAM support with object tagging: ECS 3.5 introduces support for identity and access management (IAM) with object tagging which provides IT departments more granular controls over resource access. Organizations can more easily incorporate company-defined user policies to improve their overall security posture.

◉ S3a support: ECS 3.5 also unlocks unique analytics and big data scenarios by enabling Hadoop data to be stored on ECS using the S3a protocol. This new capability has been certified for Cloudera’s QATS suite and integrates with IAM features to improve security for Hadoop workloads.
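To illustrate the object-tagging side of the IAM feature, here is a minimal sketch of the tag-set structure the standard S3 `PutObjectTagging` API expects; an S3-compatible client such as boto3 would send it against an ECS endpoint. The bucket, key, endpoint, and tag names are invented examples, not ECS-specific APIs.

```python
def build_tag_set(tags: dict) -> dict:
    """Convert a plain dict into the TagSet structure used by S3 object tagging."""
    return {"TagSet": [{"Key": k, "Value": v} for k, v in sorted(tags.items())]}

# Tags that an IAM access policy could later match on (illustrative values):
tagging = build_tag_set({"department": "finance", "classification": "restricted"})

# With boto3 against an ECS endpoint, this would be sent roughly as:
#   s3 = boto3.client("s3", endpoint_url="https://ecs.example.com")
#   s3.put_object_tagging(Bucket="reports", Key="q1.csv", Tagging=tagging)
print(tagging)
```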

And there's a lot more where that came from. With these latest enhancements, there's never been a better time to think through how ECS can unlock the value of your data. Whether you want to stand up cost-effective archives, modernize existing apps to take advantage of the simplicity of our object architecture, or support the development of cloud-native applications, the versatility and economics of ECS make it all possible on one single platform.

Source: dellemc.com

Tuesday 26 May 2020

Dell EMC SD-WAN Solution Powered by VMware: Offering Even More Modernization


Dell Technologies continues to innovate and deliver new solutions for customers as we fully accelerate to meet the Cloud era. With new partnerships with Google Cloud and new Dell Technologies Cloud Platform capabilities, Dell Technologies is helping to unify the application experience across products and technologies. That Cloud-based innovation extends to the Edge with SD-WAN.

Dell EMC SD-WAN Solution powered by VMware delivers a set of software-defined tools that can boost application performance and maintain quality of service for mission-critical applications like voice, video conferencing, and VDI. With Cloud-based centralized management, a Cloud network of virtualized gateways, and zero touch deployments, SD-WAN Solution delivers turnkey networking modernization and gives the WAN a much needed boost for supporting today’s Cloud applications. And with many businesses seeing traditional branch workloads now move into home offices for their critical workers, SD-WAN provides quality of service to keep business applications in service while sharing bandwidth with video streaming, social media, and other home traffic.

SD-WAN Solution powered by VMware modernizes networking with:

◉ Simplicity & Agility – Appliances and software combined in an all-in-one solution, saving you time and money and enabling the rapid deployment of SD-WAN

◉ Performance & Efficiency – Boost application performance with software-defined features and Cloud-based management, and gain efficiency to reduce WAN costs by up to 75%*

◉ Scale & Trust – Back your transformation at scale with enterprise-class support, services, and a global supply chain from a single vendor: Dell Technologies

We’re pleased to announce that SD-WAN Solution has expanded to offer even more choice and flexibility for turnkey WAN modernization. Take a look at our recent updates:

Expanding the Edge


New Dell EMC Edge 620, 640, and 680 models join the successful Edge 600 appliance portfolio, adding even more configuration and bandwidth options for modernization. These appliances feature Intel processors, fast DDR memory, and onboard flash storage, and are built specifically for virtualized networking workloads. And like all Dell EMC SD-WAN Solution appliances, they are factory-integrated with VMware VeloCloud software, delivering a turnkey solution for transforming networking with SD-WAN. A closer look at the new appliances:

◉ Dell EMC Edge 620 – up to 500 Mbps of performance, for a single virtualized networking function and typical bandwidth usage
◉ Dell EMC Edge 640 – up to 1 Gbps of performance, with extra bandwidth capacity for demanding applications and resources
◉ Dell EMC Edge 680 – up to 2 Gbps of performance, for multiple virtualized networking workloads and high bandwidth utilization

The expanded Edge portfolio brings even more throughput and performance to branch locations, all delivered in a sleek and compact form factor. End users simply plug the appliances in and IT handles the rest remotely via Cloud-based Centralized Management. No serial number tracking, no on-site visits, no sensitive network information pre-loaded on devices: a true game changer!

More Software-Defined Innovations


VMware has expanded the software-defined features and capabilities of the VeloCloud SD-WAN software.

The latest release, 3.4.1, introduces the following:

◉ Support for Private Segments
◉ Syslog Firewall Logging Enhancement
◉ MPLS CoS Enhancement

This new update expands upon the impressive list of features introduced in update 3.4, including:

◉ Conditional Backhaul
◉ Stateful Firewall
◉ Configure DSL Options
◉ Edge Clustering
◉ Wi-Fi Improvements
◉ Enterprise Reporting

VMware continues to push forward with SD-WAN, delivering an unmatched set of virtualized functionality on top of an already impressive list of software-defined innovations. It's no wonder they've been named a Leader in the Gartner Magic Quadrant for WAN Edge Infrastructure two years in a row.**

The Dell Technologies Advantage


With the full backing of Dell’s global supply chain, ProSupport, and PreDeploy services options, SD-WAN Solution makes the adoption and deployment of SD-WAN easy and low risk, no matter how complex the network or location. Select the Edge appliance that best fits your need – from the newly expanded 600 series to 3000 series options for data center and HQ locations – and ship directly to the site.

Network Engineers can then access the remote hardware via Centralized Management, and perform all network configuration tasks from there. No flights, no trips, no tracking of serial numbers or the risk of pre-loading devices with sensitive networking information. Deploying Dell EMC SD-WAN Solution powered by VMware is simple, fast, and low risk.

Purpose-built appliances from Dell Technologies and leading software from VMware, combined in one innovative solution: a truly better-together story to power modernization and accelerate networks to meet the challenges of the Cloud era.

Sunday 24 May 2020

Cyber Attacks – Will Your Backup Solution Fail You When You Need it Most?

In a matter of days, companies around the world showed incredible agility, moving from a limited remote workforce to a majority of employees working from home full time wherever their industries and roles allowed. Because companies needed to act quickly, many established security standards were bypassed or hastily rushed through, increasing exposure to cyber threats. Even the World Health Organization is warning against cyber attacks, citing a five-fold increase. If you're considering a data protection change, this makes cyber recovery a key decision criterion to validate before looking at other features and functionality. If the solution you choose does not allow you to recover quickly from all the different threats, including insider threats, you could be putting your company at increased risk.

The first step is to establish your definition and goals for cyber recovery, depending on your company’s security needs and desired outcomes. In today’s data protection landscape, while it’s great that most vendors can provide strategies and features around cyber recovery, if they can’t help you accomplish your goals, then the solution is unusable. If I were evaluating a cyber recovery solution, I would ask these five questions to help understand which vendor can best support my definition and goals of cyber recovery and help me reach my long-term desired outcome.

1. Does the solution maintain a protected copy of data with physical and network isolation – a logical “air gap”?  If so, how is the air gap opened and closed? 


Without an isolated, logically air-gapped environment, data is vulnerable to threat actors. If the air gap is controlled from production, it's vulnerable. If the air gap is opened and closed through a firewall or switch, it becomes a point of vulnerability, and the contents of the vault can be fully exposed while data is flowing.

Dell EMC PowerProtect Cyber Recovery provides a truly immutable, orchestrated, and automated logical air gap to protect data. The isolated vault components are never accessible from production, and access to the vault target is extremely limited. Some vendors claim to offer an “air gap” solution because their data is separate from the production network or sent offsite to tape or the cloud. However, these strategies remain fully accessible to bad actors; they provide neither sufficient protection against a wide variety of threat actors nor efficient recovery mechanisms.


2. Does your backup software create or have access to the protected copy? Does your backup admin have access to the protected copy?


Some offerings in the market have data sets that reside in production and can be accessed by an administrator with appropriate credentials. An insider or advanced threat actor will likely have access to the backup server infrastructure. If that infrastructure includes access to the protected copy, it can be deleted or encrypted. How does this setup protect against a disgruntled employee looking to delete data or a threat actor who has stolen employee credentials from a previous attack?

PowerProtect Cyber Recovery has many layers to protect against advanced threat actors, including insiders. The vault itself is physically and logically isolated, and it cannot be accessed unless the person is physically in the vault. The vault also cannot be opened or controlled from the production side, so even an insider cannot gain access without physical access. Furthermore, in keeping with recommended best practices, only CISO-appointed administrators should have physical access to the vault. And this control can be further hardened through processes that – for example – require a second person when the vault is accessed to provide oversight.

3. Even if your solution protects data with “immutable” or “locked” storage, how do you protect it from an administrator? Do you require a secure time (NTP) source to protect the immutability? 


Securing data against unauthorized changes or deletion is a good hardening practice, but it’s limited. While some immutability or locking capabilities are safe from regular users, an administrator can often override them. Vendors may claim that backups can never be changed, but they never say how this magic is performed. What happens if someone pulls drives, formats the appliance, or gains admin credentials?

Other solutions depend heavily on a secure Network Time Protocol (NTP) server as their time source. NTP is used by many components of the customer's environment, such as VMware, network, storage, and backup. What happens if a cybercriminal or insider compromises that server and moves the NTP date past the lock period? PowerProtect Cyber Recovery not only provides a compliance retention lock capability attested to comply with the SEC 17a-4(f)(ii) archival standard, but also uses an internal clock to help protect against attacks on an often vulnerable NTP server.
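A toy sketch of why the clock source matters: if a retention check trusts an external, NTP-settable clock, rolling that clock forward defeats the lock; an appliance-internal clock that ignores NTP closes that hole. The function and dates below are illustrative, not Dell EMC APIs.

```python
from datetime import datetime, timedelta

def delete_allowed(now: datetime, locked_until: datetime) -> bool:
    """A retention lock is ultimately a comparison against some clock."""
    return now >= locked_until

locked_until = datetime(2030, 1, 1)

# Honest clock: the delete is refused during the lock period.
assert not delete_allowed(datetime(2020, 5, 24), locked_until)

# Attacker advances the NTP-fed clock past the lock period:
spoofed_now = locked_until + timedelta(days=1)
assert delete_allowed(spoofed_now, locked_until)  # lock defeated

# An internal clock the attacker cannot set removes this attack path.
```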

4. Do your analytics look at full content or just metadata?


Most vendors take only a high-level view of the data, using analytics that look for obvious signs of corruption in the metadata. Metadata-level corruption is not difficult to detect, and a solution that relies on this kind of analytics alone will miss changes within the file.

PowerProtect Cyber Recovery analytics go well beyond metadata-only solutions. The solution provides full-content-based analytics, analyzing the file metadata, the document metadata, and the full content of the file itself; this is what sets PowerProtect Cyber Recovery apart from other vendors' limited offerings. It uses more advanced signals such as the entropy (a measure of randomness) and similarity of the files. For example, a metadata-based analytics solution could not determine that the contents of critical files have been encrypted if the file names have not been changed.
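As a rough illustration of the entropy signal described above (this is a generic Shannon-entropy calculation, not Dell EMC's actual analytics), encrypted or compressed content scores near the 8-bits-per-byte maximum while ordinary text scores far lower:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte: 0.0 for constant data, 8.0 at the maximum."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plain = b"quarterly revenue report " * 40   # ordinary text: low entropy
uniform = bytes(range(256)) * 4             # byte-uniform data: maximum entropy

assert shannon_entropy(plain) < 4.0
assert shannon_entropy(uniform) == 8.0
# A file whose entropy jumps toward 8 between backups, while its name and
# metadata stay unchanged, is a strong hint it was encrypted in place.
```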

For added security, the analytics operate inside the vault, where an attacker cannot compromise them. This also enables a more efficient recovery after an attack. Competing analytics solutions simply don't offer this level of intelligence or security. If they flag a file as suspicious based on a metadata scan, they must then send the entire file offsite to a cloud provider, where a second pass can be performed for full analysis. In a ransomware scenario where minutes count, this is a problematic strategy. Similarly, in a large-scale attack, network access could be destroyed or intentionally shut down, so it may not be possible to send the files or obtain the results. Knowing which files have been compromised and which are safe to recover is critical before restoring data back into production environments.

5. Does your solution rely on a hidden share or view?


This is a strategy that looks better on a marketing one-pager than in the data center. Even if the data sets are “hidden,” they are still accessible to administrators. What’s stopping an insider threat from deleting these “hidden” data sets?

With PowerProtect Cyber Recovery, IT or Backup administrators can’t access or override security credentials or retention policies in the vault.  You can rest assured knowing that you’re protected just as much from insider threats as from external ones.


Hopefully, thinking through these five questions and asking the vendors you’re evaluating how they handle these scenarios will help you make an informed Cyber Recovery decision that meets your company goals.

Saturday 23 May 2020

Accelerate Innovation with Near-Instant Database Cloning


The demand for database cloning is growing exponentially as more and more companies invest in DevOps to bring new products and features to market faster. As a result, software developers, database admins, quality assurance, data scientists, analysts and operations teams all need fast access to production data copies of critical enterprise databases.

Size and speed


The challenge is that today’s databases are huge, ranging in size from 10 to 100 terabytes. Apart from obvious storage constraints, delay is also a factor. Moving such huge files, regardless of network speed, can take hours, sometimes even days. The data also needs to be refreshed regularly to make sure the copy is as close to the production version as possible.

Barriers to innovation


It will come as no surprise, then, that according to customers, it takes too long to provision copies of multi-terabyte production databases to test environments. Those who want to leverage the Cloud for test sandbox environments often struggle to execute on their vision.

Customers say that the current process is burdened by long development cycles, poor quality releases, and a general lack of self-service. Apart from slowing down development and testing, the financial impact to business is significant. The big question is – how can we remove these obstacles and accelerate the pace of innovation?

Software-defined, scalable and powerful


This is exactly why I'm excited to offer the Actifio Database Cloning Appliance (DCA), powered by Dell Technologies – an integrated appliance combining the power of Actifio's copy data management software with our hyper-converged and software-defined platforms, such as VxRail and VxFlex with Intel Xeon Scalable processors.

Together, we've blended our IP to create an easy-to-run yet powerful platform that allows you to transform database test cycles, increase application quality and compress database provisioning times, particularly in large environments.

Customer reaction says it all


Sometimes, you can spot a winning solution right off the bat. For me, the acid test is customer reaction. Their faces said it all when we explained how, using native database tools, they could easily spin up near-real-time databases on Dell Storage, and even use PowerProtect DD appliances when retention lock is needed to meet regulatory requirements in industries like Financial Services.

I listened as Database Administrators responded to the demo, inundating us with questions and scenarios unique to their environments. I heard their excitement when they realized how they could leverage DCA in multiple creative ways to reduce development time, increase QA standards and drive more performance from mission-critical applications.

One-stop solution


If you're in operations, software development, testing, data analysis or QA, this solution will be music to your ears. It delivers near-instant clones of even the largest databases in minutes, either locally or in your Cloud of choice, and the solution is entirely API-driven to enable automation and self-service data access.

Accelerate the pace of innovation in your business. Transform the speed and cost of managing complex database environments, while simplifying management and security.

Source: dellemc.com

Thursday 21 May 2020

Introducing the New Standard for File in the Cloud

File data has experienced rapid growth in the data center. In fact, it often accounts for nearly 50 percent of an organization's on-prem data footprint. When it comes to the public cloud, adoption of file has been steady, but it is not as prevalent. This reflects the state of file services in the cloud today: as they become more mature and capable, more file-based workloads can be supported, and adoption will naturally accelerate.

This is why I'm excited to introduce OneFS for Google Cloud, part of the Dell Technologies Cloud family: a fully integrated and managed native cloud file service that combines the power and scale of OneFS with the economics and simplicity of Google Cloud. OneFS is the battle-hardened operating system of Dell EMC Isilon storage arrays, and the industry's #1 scale-out NAS filesystem.

The OneFS file service offers multi-protocol file access for NFS, SMB, and HDFS workloads. Capacity scales up to 50PB in a single namespace, with performance that scales along with it at a rate of 97 MB/s per TiB. Enterprise data management features include snapshots, native replication, Active Directory support, and high data durability and availability.
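The stated 97 MB/s per TiB rate makes aggregate throughput easy to estimate. Here is a back-of-envelope sketch (decimal GB for the result, binary TiB for capacity, as used in the post):

```python
MB_PER_TIB = 97  # OneFS for Google Cloud: read throughput per TiB provisioned

def read_throughput_gbs(capacity_tib: float) -> float:
    """Estimated aggregate sequential read throughput in GB/s for a capacity."""
    return capacity_tib * MB_PER_TIB / 1000  # 1000 MB = 1 GB

# A 2,048 TiB (2 PiB) filesystem, close to the 2PB ESG test configuration,
# works out to roughly 198.7 GB/s, in line with the ~200 GB/s measured below.
print(read_throughput_gbs(2048))
```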

Organizations can now run high performance file workloads in the cloud to take advantage of Google Cloud’s elastic compute, GPU instances and analytics services – all without having to make any changes to their applications.

In addition, the OneFS file service is natively integrated with Google Cloud, making it simple and easy to use with predictable subscription-based pricing and guaranteed performance. Customers simply order it from the Google Cloud Marketplace, and once provisioned they can configure and manage their OneFS filesystems directly from the Google Cloud console. Customers receive a single monthly bill and support from Google while Dell Technologies experts manage the operations for you.

OneFS brings game-changing performance for file data in the public cloud


Based on third-party performance testing with the IOzone benchmark, OneFS delivered sustainable read throughput of 200GB/s and write throughput of 120GB/s.

Now to put that in perspective, ESG compared these results to another NAS vendor offering file services in Google Cloud, and OneFS delivered:

◉ up to 46X higher maximum read throughput
◉ up to 96X higher maximum write throughput
◉ up to 500X higher maximum file system capacity

Let’s take a closer look at the benchmark configuration and results.


FIGURE 1: IOZONE BENCHMARK CONFIGURATION

ESG began testing with the goal of measuring maximum sequential read and write throughput, running the industry-standard IOzone benchmark. A 512K block size was used to emulate the data processing characteristics of large-file workflows such as high-definition video post-production, seismic data analysis, IoT, and genomic processing. The benchmark ran on Google compute instances, each of which mounted a single file share over NFSv3, sized at 2PB usable capacity.

The load was scaled from 64 threads, doubling the thread count with each load step, up to 1,024 threads, where peak sustainable load was achieved, as shown in Figure 2. At the peak, these I/O threads were hosted by 128 virtual machines, each with 8 vCPUs (n1-standard-8 instances).


FIGURE 2: ONEFS FOR GOOGLE CLOUD PERFORMANCE SCALABILITY BENCHMARK RESULTS

ESG validated that Dell Technologies Cloud OneFS for Google Cloud achieved a maximum read performance of 200 GB/s and maximum write performance of 120 GB/s against a 2PB storage volume.
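For intuition about what IOzone is measuring, here is a heavily simplified, single-threaded stand-in that writes 512 KiB blocks to a temporary local file and reports throughput. It is a toy sketch, not IOzone itself, and local-disk numbers will differ wildly from the NFS results above.

```python
import os
import tempfile
import time

def sequential_write_mibs(total_mib: int = 64, block_kib: int = 512) -> float:
    """Write total_mib MiB in block_kib KiB blocks; return throughput in MiB/s."""
    block = os.urandom(block_kib * 1024)
    blocks = total_mib * 1024 // block_kib
    with tempfile.NamedTemporaryFile() as f:
        start = time.perf_counter()
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())          # make sure data actually hit storage
        elapsed = time.perf_counter() - start
    return (blocks * block_kib / 1024) / elapsed

print(f"{sequential_write_mibs():.1f} MiB/s")
```

Real benchmarks like IOzone add concurrency, read paths, and careful cache control on top of this basic timed-write loop.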

It is important to note that this is a scale-out filesystem with linear scaling: if you want to double these performance numbers, double the size of the filesystem. OneFS can deliver massive throughput, up to 945GB/s. ESG performed their benchmark runs with a 2PB configuration, and OneFS for Google Cloud scales up to 50PB. That's the power of scale we're delivering!

You can now run your high-performance computing file workloads in the cloud – with an enterprise grade, scalable file service that can go the distance. The new standard for file data in the public cloud has been set. Imagine what that can do for accelerating your business.

Availability


OneFS for Google Cloud is available today. Customers in North America, Singapore and Sydney will be able to take advantage of OneFS for Google Cloud at launch, with additional global locations to be announced based on customer demand.

Source: dellemc.com

Tuesday 19 May 2020

How Dell Technologies and NVIDIA Support Natural Language Processing Technologies

I previously talked about the Rise of Deep Learning in the Enterprise and how its use is dramatically augmenting human capabilities. Gartner predicts that the Artificial Intelligence (AI) augmentation market will reach $2.9 trillion by 2021. Let's think about that number: globally, only three countries have a GDP larger than $2.9 trillion, which means this AI market will be larger than most countries' total GDP. A hint here: if you struggle to get an IT project green-lit, try incorporating an AI initiative into it. The value of AI is one of the primary reasons enterprises are fast-tracking AI infrastructure projects. In this new blog series, we'll focus on specific types of Deep Learning (DL) use cases and their impacts in the enterprise. The first is Natural Language Processing (NLP).


Natural Language Processing is the original end-state dream of AI researchers. In fact, a key basis of the Turing Test, created by Alan Turing in 1950 to determine whether a computer can think, is a machine's ability to understand human language and respond. The basic tenet is to test whether a machine can use language to fool a human into thinking it is a human being. Even 70 years later, we have yet to convincingly pass the Turing Test because, to put it simply, human language is hard.

Ever had a text or email lost in context? Chances are it happens every day. Not only do we have different languages that we speak, but we have different dialects within those languages. If we humans have a hard time understanding our language, then machines will struggle too. Now with the use of innovative DL models, machines are beginning to understand human language. In the enterprise, these applications are taking shape in three impactful areas: chatbots, text summarization, and voice interfaces.

NLP Chatbots


Chatbots are not new, but the technology has really improved in the last few years. Today you may not even be aware when you’re speaking to a machine (the Turing Test aside). Imagine using all your emails as training data to build an NLP DL model for a chatbot. All those with quick, easily repeatable responses could be cleared out of your inbox without you having to reply manually. For example, a technical support engineer is commonly asked how to reconnect an email account in an email client. The simple answer that covers 85% of the cases has been written hundreds of times in that engineer’s mailbox. The email archive can train a model that solves the problem through a chatbot. Not all emails or customer responses should be handled by chatbots, but think of this use case as a way to efficiently triage common problems and respond to them quickly.
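
As a rough sketch of that idea, the toy bot below matches an incoming question against a hypothetical email archive using bag-of-words cosine similarity; the archive entries and the 0.3 similarity threshold are invented for the example, and a production chatbot would rely on a trained DL language model rather than raw term overlap.

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a lowercased text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical "email archive": past questions paired with answers a
# support engineer has already written many times.
archive = [
    ("how do I reconnect my email client",
     "Open Settings > Accounts and re-enter your password."),
    ("how do I reset my password",
     "Use the self-service password reset portal."),
]

def chatbot_reply(question, threshold=0.3):
    """Return the archived answer whose question best matches, or escalate."""
    q = vectorize(question)
    best_score, best_answer = 0.0, None
    for past_q, answer in archive:
        score = cosine(q, vectorize(past_q))
        if score > best_score:
            best_score, best_answer = score, answer
    if best_score >= threshold:
        return best_answer
    return "Escalating to a human engineer."

print(chatbot_reply("my email client will not reconnect, how do I fix it"))
```

Anything that scores below the threshold falls through to a human, which mirrors the triage pattern described above.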

NLP Text Summarization


Let’s think back to high school. When school lets out, summer reading lists seem like they’ll be easy to tackle over the coming months, but too many fun things can get in the way. So, time flies, summer ends and those books were never opened before the first day of class. Enter CliffsNotes, the invaluable reference guides summarizing classic books for the student procrastinator or those needing a good review. NLP is helping to bring this functionality to the enterprise with text summarization. Now, hours of re-reading notes from a meeting that took place months ago can be reduced to minutes. Or what about career and professional development?

Many people struggle to keep up with the research in their field. Text summarization can help consolidate the high-level points about what’s new and deliver an easily consumable brief. Another use case is reducing the amount of time customer support engineers spend getting up to speed on a critical support issue. Saving minutes or hours for the engineer allows them to more effectively resolve the problem. Thus, NLP text summarization won’t replace reading, but it can help speed up cognition and time to results.
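
The mechanics can be illustrated with a minimal extractive summarizer that scores sentences by the frequency of their non-stopword terms; the notes string and stopword list below are invented for the example, and the DL summarization models referenced above are far more capable than this heuristic.

```python
import re
from collections import Counter

# A tiny, invented stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "is", "are", "to", "of", "and", "in", "on",
             "it", "that", "this"}

def summarize(text, n_sentences=1):
    """Score each sentence by the average frequency of its non-stopword
    terms across the whole text, then return the top sentences in their
    original order (a classic extractive-summarization heuristic)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence):
        terms = [w for w in re.findall(r"[a-z]+", sentence.lower())
                 if w not in STOPWORDS]
        return sum(freq[w] for w in terms) / (len(terms) or 1)

    ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return " ".join(s for s in sentences if s in ranked)

notes = ("The cluster failed overnight. "
         "The cluster failed because the firmware update changed the MTU. "
         "Someone also watered the plants.")
print(summarize(notes, n_sentences=1))
```

The highest-frequency theme wins, so off-topic sentences (the plants) drop out of the summary.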

NLP Voice interfaces


Smart speakers and voice assistants are prolific in the consumer space. For example, my 8-year-old uses her smart speaker to help with homework. NLP is at the heart of these emerging voice interfaces and now it’s being deployed in organizations. Remember my doctor visit story a few months ago? Does he really need to carry around a voice recorder only to have his notes dictated and transcribed later? Not with NLP. Once the doctor leaves the patient room their notes can be automatically transcribed, uploaded and made available on their laptop when they need them. Healthcare will benefit greatly from these developments, but voice interfaces will not stop at smart devices for dictation. Enterprise users will continue to request voice as an interface for tasks ranging from generating sales reports to voice-enabled research assistants. I predict there will be a surge of voice interfaces in the enterprise.

Building an Architecture for Natural Language Processing


These three emerging NLP use cases, and many others for AI and DL, require an optimal IT infrastructure to deliver expected user experiences and results. For instance, training NLP models to understand different dialects, voices and tones requires massive amounts of data, perhaps ranging from terabytes to petabytes. And NLP applications will generate even more data themselves. Since AI initiatives start with data first, it’s important to consider the storage required for this most valuable asset.

It’s equally imperative to contemplate your partners for the journey. Dell Technologies and NVIDIA are focused on helping our customers realize the value of their data with innovative AI solutions. Customers trust our expansive portfolios of best of breed hardware and software offerings to deliver high performance and scalable IT Infrastructure from sandbox proofs of concept to large-scale enterprise production. To this end, we’ve delivered Dell Technologies Ready Solutions for AI as well as reference architectures based on Dell EMC Isilon scale-out NAS with NVIDIA DGX-1™ and NVIDIA DGX-2™.  And we’re looking forward to building on these efforts with the release of the new NVIDIA DGX™ A100 system. In the upcoming months, we plan to begin testing, validating and certifying NVIDIA DGX A100 systems with our Emmy Award–winning Dell EMC Isilon scale-out NAS. Stay tuned for new solutions and reference architectures built around these essential elements of a high performance, scalable AI IT Infrastructure from Dell Technologies and NVIDIA.

Sunday 17 May 2020

Future Proof Your Dell EMC PowerStore Investment with Anytime Upgrades


Today, we announced Dell EMC PowerStore, a game-changing storage infrastructure platform engineered to solve modern data center challenges with data-centric design, intelligent automation and adaptable architecture. PowerStore appliances also come with several options from Dell Technologies Services to provide the best service experience, and we’re excited to expand on the portfolio with the introduction of Anytime Upgrades, the industry’s most flexible controller upgrade program.

Providing Confidence from the Start


Dell Technologies Services experts will get your new PowerStore up and running fast with support through the life of the appliance with our portfolio of services such as Dell EMC ProDeploy and ProSupport Enterprise Suites.

With ProDeploy Plus, we’ll deploy your PowerStore so that you can quickly take advantage of your new appliance. And ProSupport Plus, our most comprehensive support service, arms you with proactive and predictive support to maximize uptime and productivity while significantly reducing IT effort. Additionally, when you’re buying a new PowerStore appliance, you’ll need to move workloads, applications, and files. PowerStore includes new native tools that automate the migration process, but we also offer a range of migration services to help whether you have an existing Dell EMC or third-party storage device. We also offer training and certification programs to provide your IT teams with the skills to administer and manage your new PowerStore appliance.

Starting today, customers get the best of our product and services innovation, with the Anytime Upgrades program, which gives you the freedom to enhance your PowerStore infrastructure to meet changing needs.

Unmatched Flexibility with Anytime Upgrades


The Anytime Upgrades program is designed to expand and enhance PowerStore over time, providing greater choice, predictability and investment protection. Together, Anytime Upgrades and the PowerStore adaptable architecture effectively end the traditional cycle of disruptive storage platform migration with simple, flexible data-in-place upgrades — without downtime or impact to applications. And unlike other programs, Anytime Upgrades may be executed at any time (after 180 days) within your service contract as opposed to waiting years.

Because this program is an add-on to ProSupport Plus, our highest level support offer, you’ll have priority access to specialized support experts and an assigned Technology Services Manager, who serves as a trusted advisor for support planning and technology decisions.

Anytime Upgrades are available in three options:

◉ Next-Gen: Upgrade appliance nodes (controllers) to the next generation equivalent models

◉ Higher Model: Upgrade to more powerful nodes within the current generation

◉ Scale-Out: Apply a discount to expand your environment with a second appliance equal to the current model

When using the Next-Gen or Higher Model offers, the new nodes are deployed using ProDeploy Plus and supported under your existing ProSupport Plus contract at no additional cost.

You’re in good hands


Whether purchasing a single PowerStore appliance or planning a clustered environment, we know that flexibility, simplicity, support and speed of execution are crucial. With the best support and deployment services options available and now with Anytime Upgrades, customers can experience a highly flexible service with an accelerated path to productivity.

Saturday 16 May 2020

Dell Technologies and SONiC: Open Source Networking That Checks All the Boxes


As a company that is driven by innovation, Dell Technologies has a long history of engagement with the open source community.  One of those areas that is particularly exciting to me is the open source network operating system SONiC—Software for Open Networking in the Cloud.

Today we announced our Enterprise SONiC Distribution by Dell Technologies, taking the next step in our participation which extends back to the early stages of its design and development. As the momentum behind SONiC builds and more and more major players realize the value of being involved with the project, we are leading the industry by adding more features, leveraging the design of SONiC to apply it in use cases that were never before imagined or considered, and delivering unrivaled services and support to ease and speed deployment.

SONiC is based on a disaggregated, containerized, micro-services architecture, a new trend in network operating systems that delivers seemingly unlimited opportunities to innovate and simplify the management of complex, massive-scale network environments.

Dell Technologies’ contributions to the platform include:

◉ Linux kernel development support and contribution, including ongoing kernel updates and security patches
◉ Onboarding multiple switch platforms and contributing drivers
◉ Platform command-based CLI code and utilities

Today’s announcement of our Enterprise SONiC Distribution highlights our commitment to provide ongoing support for SONiC development and contributions to the SONiC open source community. Examples of that ongoing commitment include:

◉ A new management framework, contributed jointly with Broadcom, which will allow SONiC to support industry standard CLI, REST and other popular northbound interfaces

◉ Additional platform onboarding and support to expand customer choices

◉ Improving integration with the popular FRRouting stack

◉ Improving access security and AAA solutions

In addition, we continue to innovate on SONiC in conjunction with other projects for distributed compute and networking. The flexible architecture of SONiC expands the deployment options significantly.

And of course, Dell Technologies brings its industry-leading services and support to simplify even the most complex deployments. Our investment in SONiC bridges the known industry gap between the innovation of open source projects with community collaboration, and the enterprise-grade support that is essential for large-scale deployment of mission-critical applications. Our 24×7 global customer services and support cover all SONiC applications and services, from the platform and Switch Abstraction Interface, to northbound interfaces and DevOps/automation tools. All of this makes SONiC on Dell EMC PowerSwitch hardware the platform to choose for open source networking in the enterprise.

At Dell Technologies, we see great potential for SONiC to play an increasingly important role in many of our next-generation technology solutions, and we are actively working to increase the adoption of SONiC in the technical communities.

Thursday 14 May 2020

To the Edge and Beyond: What Does a Programmable Fabric Look Like?

In the first blog in this series we talked about programmable fabrics and their use cases. In this blog we’ll look at what a programmable fabric actually looks like.

The following diagram shows the high-level architecture of a programmable fabric:


The programmable fabric can be broken down into two main layers, the control plane and the data plane.

Control Plane Layer


The control plane layer is responsible for configuring and managing the data plane and is normally more centrally located, i.e., one per PoP or region.

The control plane is normally divided into three separate domains – Fabric, Telemetry, and Configuration & Management – to allow them to scale independently. However, in a small-scale implementation they could be combined into a single software controller.

1. Fabric Controller

The Fabric Controller controls the loading and programming of the data plane pipeline using the P4 Runtime interface to communicate with the data plane’s programmable forwarding engine as shown in the diagram below.

There will be a number of controller applications or “network functions” that talk to the fabric controller to control various aspects of the programmable fabric.

The Fabric Management applications manage the underlying network fabric setup and configuration. They can also be thought of as a set of virtualized switch and router network functions that provide the underlying network fabric using the programmable fabric.

The Fabric Management applications rely on user plane functionality being implemented in the P4 pipeline in the PFE.

The NF control plane uses a CUPS (Control User Plane Separation) methodology to implement the control plane portion of a Network Function while the user plane functions are pushed down into the “data plane node” as described in this document.

2. Telemetry Controller

The Telemetry Controller allows applications (i.e. Fault Management) to collect telemetry on the network elements in the programmable fabric using the Programmable Fabric’s gNMI streaming interface. It is expected that other applications will use things like machine learning to provide more intelligent decisions and provide control loop feedback into the Fabric Controller applications to provide pre-emptive service reconfiguration and repair as we move towards autonomous networks.
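
As a sketch of that control-loop feedback (using plain Python rather than the actual gNMI streaming interface), the function below watches a window of simulated interface error counters and emits a hypothetical "reroute" action toward the fabric controller when a sample deviates sharply from its recent baseline:

```python
from statistics import mean, stdev

def control_loop(samples, window=5, k=3.0):
    """Illustrative closed loop: flag any counter sample that deviates more
    than k standard deviations from its trailing window, and record a
    (hypothetical) reconfiguration action for the fabric controller."""
    actions = []
    for i in range(window, len(samples)):
        trailing = samples[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma and abs(samples[i] - mu) > k * sigma:
            # In a real deployment this would be a call into the
            # Fabric Controller's API, not a list append.
            actions.append(("reroute", i, samples[i]))
    return actions

# Simulated per-second error counters streamed from a switch interface;
# the spike at index 7 represents a failing link.
counters = [2, 3, 2, 4, 3, 2, 3, 250, 3, 2]
print(control_loop(counters))
```

A production system would replace the threshold test with a trained model, but the shape of the loop – telemetry in, corrective action out – is the same.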

3. Configuration and Management Controller

The Configuration and Management Controller will provide applications with common north bound interfaces and models for the configuration and management of the programmable fabric.

The OpenConfig group  provides a set of network data models that allow network functions to be managed using a common set of tools and protocols.  The gNMI and gNOI interfaces use the OpenConfig models to allow efficient access to configure and manage the network functions in the Programmable Fabric.

Data Plane Layer


The data plane does the bulk of the network traffic forwarding, only sending exception or control packets up to the control plane for processing (e.g. DHCP for a new IPoE session in a BNG-c). While the data plane might normally be thought of as a standalone network switch, it could also be a SmartNIC in a compute server that allows the programmable fabric to be extended up into the server (e.g. using P4 to define a pipeline in an FPGA SmartNIC).

The data plane is normally made up of several components:

1. Data Plane Node (DPN): describes the hardware that houses the data plane forwarding function (i.e. all the components below). This could be a standalone network switch with a PFE like Intel/Barefoot’s Tofino chip, or a compute server with a P4-based SmartNIC like Intel’s PAC N3000.

2. Data Plane Agent (DP-Agent): provides the standardised north bound data plane interfaces (i.e. P4 Runtime, gNMI and gNOI) that allow the control plane network functions to communicate with the data plane.  An example implementation of the DP-Agent is the ONF’s Stratum project.

3. Network Function user plane (NF-u): the user plane portions of network functions can be defined in the programmable pipeline (for example, using P4) and then loaded into the PFE to process packets. These functions are programmed by their control plane counterparts (i.e. BNG-c, UPF-c, Fabric Manager-c) in order to handle the bulk of the traffic in the PFE without needing to go up to the control plane for processing.

4. Programmable Forwarding Engine (PFE): the actual hardware that does the packet forwarding. Examples of a PFE include a P4-based switch chipset such as Intel/Barefoot’s Tofino, or an FPGA-based SmartNIC using P4 to define the packet forwarding pipeline.
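
To make the division of labor concrete, here is a toy Python model (not the real P4 Runtime API) of a controller programming match-action entries into a PFE: matched packets are handled entirely in the data plane, and only table misses are punted up to the control plane:

```python
from dataclasses import dataclass, field

@dataclass
class TableEntry:
    """Simplified stand-in for a P4Runtime table entry: match fields -> action."""
    match: dict
    action: str
    params: dict = field(default_factory=dict)

@dataclass
class ForwardingEngine:
    """Toy PFE: the controller installs entries, the data plane matches
    packets locally, and misses take the exception path upward."""
    table: list = field(default_factory=list)

    def insert(self, entry):
        # What the fabric controller would do over the P4 Runtime interface.
        self.table.append(entry)

    def process(self, packet):
        # What the PFE does per packet, at line rate.
        for e in self.table:
            if all(packet.get(k) == v for k, v in e.match.items()):
                return (e.action, e.params)
        return ("punt_to_control_plane", {})

pfe = ForwardingEngine()
pfe.insert(TableEntry(match={"dst_ip": "10.0.0.1"}, action="forward",
                      params={"port": 3}))
print(pfe.process({"dst_ip": "10.0.0.1"}))   # matched: handled in the data plane
print(pfe.process({"dst_ip": "10.9.9.9"}))   # miss: exception path to controller
```

The real interface serializes these entries over gRPC and the matching runs in silicon, but the controller-programs/data-plane-forwards split is exactly this.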

Dell Technologies is committed to driving disaggregation and innovation through open architectures and the competitiveness this brings to our customers’ networks. The high-level architecture described in this blog is in line with the Open Networking Foundation’s Stratum and NG-SDN projects and provides open building blocks that allow telecommunication providers to build open, scalable and cost effective edge solutions.

Tuesday 12 May 2020

Keep Focus on Workloads with a Future-Proof Infrastructure

When you have the reassurance that your IT infrastructure investments are protected, you can worry less about the critical workloads that keep your organization running. So far in our Direct2DellEMC series on optimizing your infrastructure for critical business workloads, we have been discussing how selecting the right technology provider can make the difference in organizations delivering superior outcomes. Today we will take that conversation one step further and explore the benefits of a total package – advanced technologies, coupled with guarantees, offers and assurances.

Our advice is to center your decision-making around the use cases, applications and databases that have the most direct impact on your organization’s strategic differentiation. By keeping focus on the performance and simplicity required by the workloads that drive the unique capabilities your business relies on for differentiation, you can optimize your infrastructure with each investment. But before you take your next step in infrastructure modernization, consider whether the strategy you are following is future-proof.


As IT leaders have learned throughout their careers, eliminating unknowns increases the likelihood of positive outcomes down the road. The best way to sleep at night is to know that your investments are protected.

In addition to our full stack of IT solutions and the services offered by our experts, we know organizations turn to Dell Technologies because we are an established leader in the industry. And we remain one of the most trusted technology providers around the globe because, in addition to helping your IT staff enact modernization efforts, we ensure that your satisfaction is guaranteed.

To bring you that further peace of mind, we have made a set of promises through our Future-Proof Program that guarantee our resilient storage, data protection, networking and hyperconverged products deliver value throughout their lifespan. The program is designed to help you optimize the IT lifecycle through a series of guarantees, offers, and assurances. Future-Proof provides support from beginning to end by guaranteeing outcomes, maximizing investments and helping you navigate the future of IT.

Future-Proof guarantees outcomes by going above and beyond expectations with guarantees that our world class technology capabilities will deliver as promised. Because we stand firmly behind our solutions, we hope you can plan to support your workloads confidently, assured that you will neither over- nor under-provision for your business needs.

Future-Proof maximizes your investments by ensuring that you can easily purchase, exchange, and upgrade IT solutions to seamlessly modernize technologies today and into the future. Future-Proof also helps you to navigate IT futures with technologies and features designed to flexibly operate on premises or in the cloud, today and in the future. By eliminating future cost uncertainties for acquired solutions, you will be able to better budget for new IT needs as your workloads demand.

Don’t forget that our advanced offerings across the infrastructure stack are available with flexible payment options through Dell Technologies On Demand, which includes value-added services with ProDeploy, ProSupport and Managed Services. These services can be bundled effortlessly and paired with all the financial consumption models and with the Future-Proof program.

Dell Technologies has the right technology and programs to provide your IT staff with the peace of mind that 2020 demands. And the Future-Proof program enables customers to focus on critical business needs while Dell Technologies handles the rest. By delivering the performance, scalability and resiliency you require across our powered-up portfolio of infrastructure, we help ensure your IT investments will successfully support the workloads that your business depends on to grow.

Source: dellemc.com

Sunday 10 May 2020

Dell EMC CloudIQ: Enabling Faster Time to Insight


Customers love Dell EMC CloudIQ because it delivers actionable insights by combining machine learning and human intelligence, providing real-time performance and capacity analysis plus historical tracking, all in a single pane-of-glass view.

Data grows exponentially each year, while budgets and staffing do not grow at the same rate, so tools that continue the transition to more autonomous infrastructure are becoming increasingly essential. Making the pivot from reactive to proactive management is the next step toward the autonomous data center. It’s only natural that CloudIQ would evolve not only to provide broader support, but also to streamline management of your data center. CloudIQ supports all major Dell EMC storage platforms, Connectrix switches, and VxBlock converged infrastructure, and we are excited to share that it will continue to expand across the Dell Technologies infrastructure portfolio for even broader data center insights. As we bring CloudIQ across the portfolio, you’ll also see Dell Technologies introduce new features and functionality that are designed to ease management and drive more automation in the data era.

In the past year, the CloudIQ team has been hard at work with our product teams to enhance features and add new functionality to help users streamline administrative tasks and simplify infrastructure management. Leveraging machine learning, CloudIQ helps anticipate customers’ problems by turning predictive analytics into actionable insights; its algorithms are continuously updated with Dell EMC product and subject matter expertise. Data is collected on an ongoing basis and combined with industry best practices to address the most potentially impactful issues. This provides IT administrators with the intel they need to take quick action and manage their data center environment more efficiently.

FASTER TIME TO INSIGHT WITH CLOUDIQ


CloudIQ provides streamlined functionality such as performance and capacity anomaly detection, performance impact analysis, and workload contention identification. With a simple and easy interface, detecting and troubleshooting issues is even easier with CloudIQ.

Faster Time to Insight

With over 30,000 arrays connected to CloudIQ, processing 30 billion data points per day, and adoption growing at a rate of over 2,000 systems each month, CloudIQ is continuously getting smarter, better informing users by arming them with actionable insights.

Reduce Risk

CloudIQ makes daily storage administration tasks easier by helping you identify potential issues before they impact your environment. CloudIQ proactive health scores give you an at-a-glance view of issues across your environment, prioritizing them to surface the most imminent risks so users can take quick, appropriate action. Performance anomaly detection and impact analysis use machine learning to zero in on incidents that had an impact on the environment and need remediation. CloudIQ’s VMware integration enables end-to-end analysis of VM activity in the context of the storage systems they are managing, without having to access or view a separate portal.

Plan Ahead

CloudIQ helps you stay ahead of business needs with capacity planning tools such as Capacity Full Prediction, which allows you to plan for future budgetary needs. Capacity Anomaly Detection identifies a sudden surge in capacity utilization that could result in imminent data unavailability, helping you avoid 2 a.m. phone calls.
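
Conceptually, a capacity-full prediction can be as simple as fitting a trend line to daily utilization samples and projecting it forward; the least-squares sketch below uses invented numbers and stands in for CloudIQ’s considerably more sophisticated models:

```python
def days_until_full(used_tb, capacity_tb):
    """Fit a least-squares line to daily used-capacity samples and project
    when usage reaches the array's capacity, in days from today.
    Returns None if usage is flat or shrinking."""
    n = len(used_tb)
    xs = range(n)
    x_mean = (n - 1) / 2
    y_mean = sum(used_tb) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, used_tb))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope <= 0:
        return None
    intercept = y_mean - slope * x_mean
    # Solve intercept + slope * day = capacity_tb, relative to today (day n-1).
    return (capacity_tb - intercept) / slope - (n - 1)

# Ten days of used capacity growing about 1 TB/day on a 100 TB array.
usage = [80, 81, 82, 83, 84, 85, 86, 87, 88, 89]
print(round(days_until_full(usage, 100)))
```

The same projection, run continuously against real telemetry, is what lets a tool warn about an array filling up before it becomes an outage.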

Improve Productivity

CloudIQ helps you make the most of your resources, optimizing both staff time and equipment through a single pane-of-glass view of your environment. You can enable CloudIQ across all major Dell EMC storage platforms, Connectrix switches, and VxBlock converged infrastructure. The breadth of support gives users broad oversight of data center health, with plans to extend support across all ISG portfolio products. The CloudIQ mobile app makes it even easier to check on your data center environment anywhere and anytime.

For additional oversight you can grant your account team Trusted Advisor access to receive timely best practice recommendations and guidance to optimize your environment and prevent potential issues, often before you even know there is a problem. Trusted Advisors were asked to evaluate time to resolution for common scenarios with and without CloudIQ, and they reported being able to resolve issues on average 3x faster using CloudIQ.

Thursday 7 May 2020

Inside Dell EMC PowerStore: AppsON Delivers Groundbreaking Application Flexibility and Mobility

We are in a new era of IT, one fueled by the explosion of data and technology innovation. Many organizations are on the cusp of becoming digital powerhouses in this new era, but two things stand in their way:

1. Data is created, processed, and stored everywhere—from edge locations to core data centers to public clouds.

2. IT organizations are expected to support an ever-increasing number of workloads, everything from traditional applications to edge analytics, all while delivering greater levels of simplicity, agility, efficiency, and cost-effectiveness.

Today, we are proud to announce Dell EMC PowerStore, a pioneering new modern infrastructure platform built from the ground up with best-in-class expertise and technology to address the challenges of the data era.

One of PowerStore’s game-changing features is AppsON, an industry-first capability that allows VMware virtualized workloads to run directly on the purpose-built array, delivering groundbreaking application mobility and flexibility. Let’s take a closer look at what makes AppsON special, ideal workloads, and how it can complement your existing storage and infrastructure investments.


What Is AppsON?


The unparalleled flexibility and mobility provided by AppsON is made possible because PowerStore is the only purpose-built array with VMware vSphere¹ built-in. Integration with vSphere results in simplified, streamlined management where storage resources plug directly into the virtualization layer.

Storage administration is also made simpler, as supporting data management applications can be run directly on the array, streamlining operations and consolidating targeted external VMs. The consolidated solution provided by PowerStore with AppsON offers unique capabilities for environments where infrastructure simplicity and density are desirable or critical, including edge computing, ROBO, mobile and tactical deployments.


What Workloads are Ideal for AppsON?


AppsON is ideal for a variety of workloads, namely infrastructure and data intensive applications. Infrastructure applications include anything that an administrator needs to run their data center, including anti-virus, data protection and monitoring software. This enables an administrator and their broader infrastructure team to simplify operations and have full control over their storage environment.

Data-intensive applications fall into two categories – those that are latency sensitive and those that require an imbalance of storage versus compute. These include, but aren’t limited to, applications that require a small footprint yet must process and store vast amounts of data, such as edge and analytics applications.

How Does AppsON Complement Existing Infrastructure?


AppsON further benefits IT organizations by providing new flexibility while continuing to leverage existing infrastructure investments.

It complements existing platforms, including Dell Technologies’ #1 HCI solution VxRail by providing a landing zone for storage-intensive workloads that require superior data efficiency and “always on” data reduction in the smallest of form factors.

In addition, existing investments in infrastructure and processes can be preserved as a PowerStore using AppsON can serve storage to external servers via FC and iSCSI, just like a regular SAN block array, while simultaneously running enterprise Virtual Machines with VMware vSphere internally.

PowerStore cluster management, combined with VMware vSphere including vMotion and Storage vMotion, enable seamless application mobility between PowerStore and other VMware targets. Using a single storage instance, applications can be deployed on networked servers, hyperconverged infrastructure (i.e. VxRail), or directly on the PowerStore appliance and migrated transparently between them. This unparalleled agility enables IT and application owners to quickly and seamlessly deploy and reassign workloads to the most effective environment based on current requirements and available resources.

Where Does PowerStore Fit into My Existing VMware Environment?


Dell EMC PowerStore complements and extends your existing infrastructure investments, especially your VMware environment. With AppsON, PowerStore can provide both storage capacity for applications and a VMware based environment for hosting applications locally. PowerStore has comprehensive support for VMware environments: the vRealize Orchestrator plugin for Dell EMC PowerStore helps automate storage provisioning and operations, PowerStore’s innovative single architecture provides native vVol support, and Cloud Storage Services can provide Disaster Recovery as a Service (DRaaS) to VMware Cloud on AWS.

Because of innovation like AppsON, Dell EMC PowerStore can revolutionize your data center, and we’re excited for you to see that firsthand. 

Tuesday 5 May 2020

Unlocking Data Insights with PowerEdge and Microsoft SQL Server


Understanding how to uncover insights and drive value from data can give your organization a distinct competitive advantage. This can be complicated since the IT landscape is constantly evolving, especially when it comes to data management and analytics. In a recently published report, ESG surveyed IT decision makers and “nearly two-thirds (64%) said they believe IT is more complex now compared with two years ago; and another 17% said it is significantly more complex.”

Organizations can navigate these complexities by focusing on unlocking data insights and bolstering their security posture to protect data.

Unlocking Data Insights


One of the biggest advancements in database technology came with the introduction of SQL Server 2019 and its new data analytics capabilities. Using R and Python, SQL Server’s Machine Learning Services can analyze data across multiple disparate data sources, not just the data contained within the SQL database. SQL Server is no longer just a database; it is the engine that collects and analyzes data wherever that data lives and in whatever form, structured or unstructured. This eliminates the time and expense associated with data ingestion, providing quicker insights to inform business decisions.
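To make the idea concrete, Machine Learning Services runs R or Python scripts in-database through the `sp_execute_external_script` stored procedure. The sketch below (in Python, with a hypothetical server, table, and script) shows the shape of such a call; it only builds the T-SQL text, and the actual connection step is left commented out:

```python
# Sketch: shaping a SQL Server 2019 Machine Learning Services call.
# The table name, server, and embedded script are illustrative placeholders.

def build_ml_query(py_script: str, input_sql: str) -> str:
    """Wrap a Python script in a T-SQL sp_execute_external_script call."""
    return (
        "EXEC sp_execute_external_script\n"
        "    @language = N'Python',\n"
        f"    @script = N'{py_script}',\n"
        f"    @input_data_1 = N'{input_sql}';"
    )

# The embedded script receives the query result as a pandas DataFrame
# named InputDataSet and hands results back via OutputDataSet.
query = build_ml_query(
    "OutputDataSet = InputDataSet.describe()",
    "SELECT amount, region FROM dbo.Sales",
)

# Against a live server, one would execute it via an ODBC client, e.g.:
# import pyodbc
# conn = pyodbc.connect("Driver={ODBC Driver 17 for SQL Server};Server=...;")
# rows = conn.execute(query).fetchall()
print(query)
```

Because the script executes inside the database engine, the data never has to be exported to a separate analytics environment, which is what saves the ingestion time mentioned above.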

Also key is SQL Server 2019’s platform compatibility. For the first time, there is full feature parity on Red Hat Enterprise Linux and support for Kubernetes. A Linux shop can get the full benefit of SQL Server without having to refactor for Windows.

The reporting capabilities of SQL Server Reporting Services, along with the included Power BI Report Server, produce powerful reports that turn raw ones and zeros into real information. That translation of data into knowledge gives businesses an edge.

In order to make the most of these cutting-edge capabilities, it is worth taking a look at the hardware that houses the data estate. When time is of the essence, it is important to be able to process this data efficiently, with little to no latency. That is one of the ways PowerEdge servers with Intel® Optane™ persistent memory can add value to the SQL Server environment. Optane persistent memory enables transactions to be performed directly on the memory bus, eliminating the extra time needed for data to transfer from the storage module to the processor. In simpler terms, having storage available in a memory slot gives data quick access to the “brains” of the server. Consider this: a Dell EMC PowerEdge R740xd server using Intel® Optane™ persistent memory delivered 2.2 times the Microsoft SQL Server 2019 performance of a two-NVMe-drive configuration and improved performance even more significantly over SATA SSDs, delivering 11.3 times the transactions per minute.¹

SQL is Synonymous with Security


Protecting the data that is the foundation of business is top of mind. In fact, ESG reports that organizations are fighting the battle on two fronts: “40% of respondents identified the need to strengthen their cyber security position, but nearly half (44%) also cite chronic skill shortages in the area of cybersecurity.”

Fortunately, SQL Server is widely recognized as a secure data platform. It is designed with a number of security and compliance features, including the ability to encrypt sensitive data. It is important to know that security patches for SQL Server 2008 ended in July 2019, and mainstream support for SQL Server 2012 ended in October 2018. Given these recent end-of-support dates and the benefits above, now is the optimal time to consider migrating to SQL Server 2019.
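Encryption applies to data in motion as well as at rest: a client can require a TLS-encrypted session simply through its connection string. A minimal sketch (the server address, database name, and driver version are hypothetical placeholders):

```python
# Sketch: an ODBC connection string that requests an encrypted session.
# Server and database names are illustrative; adapt to your environment.

def encrypted_conn_string(server: str, database: str) -> str:
    """Return an ODBC connection string with TLS encryption enabled."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server={server};Database={database};"
        "Encrypt=yes;TrustServerCertificate=no;"
        "Trusted_Connection=yes;"
    )

conn_str = encrypted_conn_string("sqlprod01.example.com", "SalesDB")

# With the pyodbc package installed and a reachable server:
# import pyodbc
# conn = pyodbc.connect(conn_str)
print(conn_str)
```

Setting `TrustServerCertificate=no` keeps certificate validation on, so the client refuses connections to servers that cannot prove their identity.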

Protecting data infrastructure starts at the hardware level. PowerEdge servers are built with integrated security features such as the “always-on” iDRAC, which provides system monitoring and alerts. Security features are built directly into the firmware to help block malicious attacks, detect anomalous activity, and restore critical operations when necessary.
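iDRAC exposes its monitoring data over the DMTF Redfish REST API, so health checks can be scripted. The sketch below only constructs the endpoint URL; the iDRAC address and credentials are placeholders, and `System.Embedded.1` is the system resource identifier iDRAC typically uses:

```python
# Sketch: locating a server's Redfish system resource on an iDRAC.
# Host, credentials, and resource ID are illustrative; adapt to your setup.

def system_health_url(idrac_host: str) -> str:
    """Redfish endpoint for the server's system resource on an iDRAC."""
    return f"https://{idrac_host}/redfish/v1/Systems/System.Embedded.1"

url = system_health_url("192.0.2.10")

# With the requests package and a reachable iDRAC (disable verify only in labs):
# import requests
# resp = requests.get(url, auth=("root", "password"), verify=False)
# health = resp.json()["Status"]["Health"]
print(url)
```

Polling an endpoint like this from a monitoring system is one way to turn iDRAC’s “always-on” telemetry into automated alerts.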

Additionally, Dell Technologies OpenManage systems management solutions can help simplify, automate, and optimize IT operations. For example, OpenManage Integrations for Microsoft System Center and Windows Admin Center enable visibility and control of hardware infrastructure, operating systems, virtual machines, and containers.

Simplifying the Data Management Landscape


Dell Technologies and Microsoft have partnered for over 35 years. During that time, Dell has earned numerous global competencies and over 50 Microsoft “Partner of the Year” awards. This long-standing relationship makes Dell Technologies the ideal infrastructure partner, as we offer a broad suite of products to support Microsoft platforms.

Choosing to implement new software does not have to be complicated. Eliminate the need to manage multiple purchase orders or vendors with a PowerEdge and SQL 2019 OEM solution. Customers who choose to purchase SQL with their PowerEdge servers will be able to easily install the software on their PowerEdge platform.

Sunday 3 May 2020

Dell Digital Way at the Forefront of Partner Experience Evolution


These days, just about everyone has embarked on some kind of Digital Transformation journey. What are the first things that come to mind when you hear that term? Probably cloud applications, mobility, artificial intelligence, and machine learning. Although technology innovations like these are accelerating progress at a rapid rate, the keys to driving lasting business transformation are rooted in traditional business practices.

Successful Digital Transformation initiatives have clear and accountable business ownership, agile collaboration with IT, a focus on policy simplification, process redesign, and robust change management practices. Without these, the full value and opportunity of the technology investment will never be realized.

At Dell Technologies, we embody this approach in a cohesive end-to-end strategy we’ve defined as the “Dell Digital Way.” It’s an ecosystem with an agile approach to the way we do business with our partners – prioritizing business needs, designing and deploying the right solutions with the best technology in the industry, and simplifying our policies and processes to create seamless end-to-end online experiences.


Delivering a Fast, Seamless and Intuitive Online Experience


This approach has enabled growth in our Channel business, which delivered $52 billion in orders in the 12 months leading up to Q4. For our partners, we’ve been on a multi-year journey, listening to feedback and driving consistent progress quarter over quarter to make it easier for them to do more business with us. View our recent Mid-Year Partner Update.

As we work to close out the year, we are poised to break through in 2020 with a compelling set of end-to-end capabilities that will help our Partners further capitalize on our Simple. Predictable. Profitable. value proposition.

◉ Dynamic Deal Registration to quickly unlock both front-end and back-end profitability

◉ Integrated Configuration and Pricing to sell across the Dell Technologies portfolio and get to a winning price fast

◉ Predictable and Intuitive execution and administration of our Dell Technologies Partner Program

We have a winning strategy focused on the needs of our partners, deploying the best technology in the industry and driving lasting business change. I look forward to continuing this transformation journey together.

Source: dellemc.com