NikolaNews

Nvidia Spotlights Data Science, AI and Machine Learning at GPU Technology Conference

March 19, 2019
in Machine Learning

Credit: Google News


Data science is the new ground zero for Nvidia, which introduced new software AI libraries running on its Tensor Core GPUs at its GPU Technology Conference in San Jose yesterday. The Cuda-X AI libraries are written to speed up machine-learning and data-science operations by as much as 50x, the company said, with far-reaching implications for such AI applications as speech and image recognition as well as risk assessment, fraud detection and inventory management.

Nvidia said the Cuda-X AI software stack includes cuDNN for deep-learning primitives, cuML for machine-learning algorithms, TensorRT for optimizing trained models for inference, and other libraries. Running on Tensor Core GPUs, the libraries can be integrated into deep-learning frameworks such as TensorFlow, PyTorch and MXNet, or into popular cloud platforms.
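A quick sketch of the drop-in pattern cuML markets: the library advertises a scikit-learn-compatible API, so in principle the same code targets CPU or GPU depending on which import resolves. The `cuml.cluster` module path here is an assumption about a RAPIDS install; the scikit-learn fallback keeps the sketch runnable without a GPU.

```python
import numpy as np

# cuML advertises a scikit-learn-compatible API, so the advertised
# migration path is essentially a one-line import swap. The cuml path
# below is an assumption about a RAPIDS install; sklearn is the CPU
# stand-in that lets this sketch run anywhere.
try:
    from cuml.cluster import KMeans     # GPU path on Tensor Core GPUs
except ImportError:
    from sklearn.cluster import KMeans  # CPU fallback for this sketch

# Two obvious clusters: two points near the origin, two near (5, 5)
X = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0], [5.1, 5.1]])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Each pair of nearby points lands in the same cluster
print(sorted(km.labels_.tolist()))  # [0, 0, 1, 1]
```

The point of the pattern is that the estimator object, `fit` call and `labels_` attribute are unchanged; only the import decides whether the work runs on the GPU.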

Nvidia T4 Tensor Core GPU
Nvidia

To make Cuda-X accelerated analysis available to a broad range of users, Nvidia said its T4 Tensor Core GPUs will be available via Amazon EC2 G4 instances “in the coming weeks.” When those go online, they will support machine-learning applications as well as graphics workloads such as real-time ray tracing, simulation and rasterization. Eager AWS users can apply for an advance preview of EC2 G4 instances by filling out a form on the AWS website. Meanwhile, the RAPIDS open-source suite of libraries, which gives users access to parallel GPU processing and high-speed memory through Python interfaces, is already available via the Microsoft Azure Machine Learning service.
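The “Python interfaces” RAPIDS exposes include cuDF, which mirrors the pandas DataFrame API; the sketch below shows the shape of a workflow RAPIDS is built to accelerate. The `cudf` import is an assumption about a RAPIDS environment, and the pandas fallback keeps the sketch runnable without a GPU.

```python
# RAPIDS cuDF mirrors the pandas DataFrame API, so a typical pandas
# workflow is the shape of code it accelerates. The cudf import is an
# assumption about a RAPIDS install; pandas is the CPU stand-in.
try:
    import cudf as dflib    # DataFrames held in GPU memory
except ImportError:
    import pandas as dflib  # same API surface on the CPU

# A toy inventory table, the kind of data the article's fraud-detection
# and inventory-management examples would feed through RAPIDS
frame = dflib.DataFrame({"sku": ["a", "a", "b"], "units": [3, 4, 5]})

# Under cuDF this group-by aggregation runs in parallel on the GPU
totals = frame.groupby("sku").sum()
print(totals["units"].tolist())  # [7, 5]
```

As with cuML above, the selling point is API compatibility: the DataFrame construction and `groupby` call are written once, and the import decides the backend.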

Workstations Tuned for Data Science

Nvidia is also contributing its hardware and data-science software stack, including Cuda-X AI, to a new class of high-powered workstations tuned specifically to work in concert with data centers and power high-end data-science applications. According to Bob Pette, Nvidia’s VP and GM of professional visualization and Quadro graphics, the new Nvidia-certified systems will be built and sold by an array of vendors: globally, Dell, HP and Lenovo; regionally, AMAZ, APY, Azken Muga, BOXX, CADNetwork, Carri, Colfax, Delta, EXXACT, Microway, Scan, Sysgen and Thinkmate. Optional enterprise support contracts will be sold by those OEMs and supported through Nvidia, Pette said.

As an example of performance targets for the new systems, Pette said an accelerated library for data scientists running on dual Quadro RTX 8000 GPUs could achieve greater accuracy and ten times faster turnaround than CPU-only processing nodes.

Real-Time Ray Tracing Is a Reality

Nvidia said its RTX platform, which includes GPU hardware custom-designed to accelerate ray tracing, has made inroads in the industry since its introduction at SIGGRAPH last year, including an endorsement from Pixar Animation Studios, which said it will use RTX on its upcoming films. Nvidia also says ILM, Image Engine, MPC Film and Weta Digital are using RTX in their VFX workflows. Software products incorporating RTX in 2019 releases include Adobe Dimension & Substance Designer, Autodesk Arnold & VRED, Chaos Group V-Ray, Dassault Systèmes CATIA Live Rendering & SOLIDWORKS Visualize 2019, Daz 3D Daz Studio, Enscape Enscape3D, Epic Games Unreal Engine 4.22, ESI Group IC.IDO 13.0, Foundry Modo, Isotropix Clarisse 4.0, Luxion KeyShot 9, OTOY Octane 2019.2, Pixar RenderMan XPU, Redshift Renderer 3.0, Siemens NX Ray Traced Studio, and Unity Technologies Unity (2020).

Nvidia RTX Blade Servers

Nvidia RTX Blade Servers
Nvidia

On the server side, the company debuted a new RTX Server configured with 1,280 Turing GPUs on 32 RTX blade servers, each squeezing 40 GPUs into an 8RU space. With low-latency access to such servers, Nvidia argued, cloud-rendered videogames and AR/VR applications become feasible over a 5G network. Nvidia also welcomed an array of new manufacturers for its T4 servers, built for GPU-accelerated data analysis, including Cisco, Dell EMC, Fujitsu, HPE, Inspur, Lenovo and Sugon.

Last but not least, Nvidia revealed Omniverse, a new enterprise collaboration platform for studios working with real-time graphics. Supporting industry-standard technologies including Pixar’s Universal Scene Description and Nvidia’s own Material Definition Language, Omniverse maintains live, bidirectional connectivity between applications such as Autodesk Maya, Adobe Photoshop and Epic Games’ Unreal Engine, so artists using one application can immediately see changes made by artists working in another. Naturally, the Omniverse Viewer supports RTX RT Cores, Cuda Cores and Tensor Core-accelerated AI. Nvidia said Omniverse is launching as a “lighthouse” program; interested users can request access to the client SDK, or inclusion in the program, at Nvidia’s developer site.



© 2019 NikolaNews.com - Global Tech Updates
