
Send SMF data directly to the IBM Db2 Analytics Accelerator

August 3, 2019

Credit: IBM

Use the new capabilities of TDSz and CDPz



If you are already using IBM Db2® Analytics Accelerator and
Tivoli® Decision Support for z/OS® (TDSz), you can now load data directly
to Db2 Analytics Accelerator with the TDSz schema, without storing the data in Db2
tables. With this feature, you can save CPU and space on expensive storage,
and you can store and query data over a longer period at a more detailed
level. This tutorial shows you how.

First, let’s look at a few products and features.

Common Data Provider for z/OS (CDPz) is a single source
that helps bring all operational data from the z/OS platform to your
analytics platform of choice, such as Operations Analytics for z
Systems®; Splunk; and Elasticsearch, Logstash and Kibana (ELK).

Tivoli Decision Support for z/OS (TDSz) collects log data
from various sources and provides a central repository for easy access to
historical enterprise-wide IT utilization information and statistics. This
information is valuable for use in providing performance reporting, data
for service-level management, and usage accounting. TDSz uses Db2 for z/OS
as its repository.

IBM Db2 Analytics Accelerator for z/OS is a
high-performance appliance that helps you query data at high speed,
extending the capabilities of Db2 for z/OS while lowering operating costs.

Tivoli Decision Support for z/OS now includes analytics
components that are designed to support the Db2 Analytics Accelerator.
These components are based on existing non-analytics components, which are
modified to allow for the following:

  • Storing data directly to Db2 Analytics Accelerator, thus
    removing the need to store data on Db2 for z/OS.
  • Storing more detailed timestamp-level records.
  • Moving more CPU work from z/OS to the Db2 Analytics Accelerator
    appliance.
  • Reporting to make use of the high query speeds of the Db2
    Analytics Accelerator.

“You can now directly load data to Db2 Analytics Accelerator with the TDSz
schema, without storing data in Db2 tables, saving CPU and space on
expensive storage. This tutorial shows you how.”

The System Data Engine (SDE) component of the Common Data Provider to z/OS
supports writing data in Db2 internal load format (batch mode). The IBM
Db2 Analytics Accelerator Loader for z/OS is then used to load the Db2
internal format data sets directly into the IBM Db2 Analytics
Accelerator.

The collection is currently not real time. The System Data Engine reads
from System Management Facilities (SMF) data sets, as shown here.

Chart showing the reading of SMF data sets

What you will need

You need the following products to send TDSz data directly to Db2 Analytics Accelerator:

  • Tivoli Decision Support for z/OS 1.8.1
  • Common Data Provider to z/OS 1.1
  • Db2 Analytics Accelerator 5.1 and Loader 2.1

In addition, you need the following maintenance packages:

  • Program temporary fixes (PTFs) for CDPz APARs OA52196 and OA52200
    provide the SHBODEFS members that are required to produce Db2 internal
    load format output ready to load into Db2 Analytics Accelerator using the TDSz analytics schema.
  • PTF for TDSz APAR PI70968 provides TDSz enhancements and analytics
    schema.
  • The Db2 Analytics Accelerator Loader for z/OS V2.1 uses Db2 Analytics Accelerator-only load
    mode to load the data created by the SDE into the Db2 Analytics Accelerator.

Initial support is limited to:

  • Analytics – z/OS Performance
  • Analytics – Db2
  • Analytics – KPM CICS
  • Analytics – KPM Db2
  • Analytics – KPM z/OS

KPM components of Tivoli Decision Support for z/OS are available only with
Version 1.8.2.

Implementation using Accelerator-only tables

There are a couple of ways you can implement sending SMF data directly to
Db2 Analytics Accelerator tables. This section explains how you can do it in one step. When you
install the analytics component in TDSz, the tables are created directly
in Db2 Analytics Accelerator. In the next section (“Implementation not using Accelerator-only tables”), you start by
creating the tables first in Db2 and then you can move the tables into
Db2 Analytics Accelerator when ready.

Bind the Db2 plan used for Tivoli Decision Support

After applying the required PTF and running all the actions, including
refreshing Tivoli Decision Support for z/OS system tables, do the
following:

  1. Select Reports.
  2. Select System.
  3. Select Dialog parameters.
  4. Select System tables. Press F6 to update the system tables.

You need to rebind the Db2 plan used for Tivoli Decision Support by
specifying the BIND option
QUERYACCELERATION(ELIGIBLE) or
QUERYACCELERATION(ENABLE).

You can use the DRLJDBIN sample from the SDRLCNTL data set (the one used to
create the TDSz database).

//DBINIT   EXEC PGM=IKJEFT01
//STEPLIB  DD  DISP=SHR,DSN=hlq_db2.SDSNEXIT
//         DD  DISP=SHR,DSN=hlq_db2.SDSNLOAD
//DBRMLIB  DD  DISP=SHR,DSN=hlq_tds.SDRLCNTL(DRLPSQLX)
//SYSPRINT DD  SYSOUT=*
//SYSTSPRT DD  SYSOUT=*
//SYSTSIN  DD  *
 DSN SYSTEM(dsn)
   BIND PACKAGE(DRLPLAN) MEMBER(DRLPSQLX) -
     ACTION(REPLACE) ISOLATION(CS) ENCODING(EBCDIC) -
     QUERYACCELERATION(ELIGIBLE)
   BIND PLAN(DRLPLAN) PKLIST(*.DRLPLAN.*) -
     ACTION(REPLACE) RETAIN

   RUN PROGRAM(DSNTIAD) PLAN(DSNTIA12) -
     LIB('hlq_db2.RUNLIB.LOAD')
 END

The QUERYACCELERATION parameter is allowed only for Db2
subsystems that are enabled for query acceleration.
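
If you are not sure whether your subsystem is enabled, one quick check (a
sketch; the exact output depends on your Db2 and Accelerator levels) is to
display the accelerators from the Db2 command interface and confirm that the
accelerator you plan to use (HOLIDAA in the later examples) is started:

 -DISPLAY ACCEL(*)

If the command is rejected or the accelerator is not listed as started,
resolve that before you rebind.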

Modify the TDSz profile data set

Now, if you want to use accelerator-only tables (AOTs), modify the DRLFPROF
data set with the new parameters.

/*---------------------------------------------------------------*/
/* IDAA Enablement                                               */
/*---------------------------------------------------------------*/
def_useaot      = "YES"                                            
def_accelerator = "HOLIDAA"                                        
def_timeint     = "T"
  • Change def_useaot to "YES" if you want to use accelerator-only
    tables. Leave it at "NO" if you want to decide later.
  • def_accelerator = "xxxxxxxx" (the name of the Accelerator).
  • def_timeint = "H" | "S" | "T" (see the example after this list):

    • "H" – the timestamp is rounded to an hourly interval.
    • "S" – the timestamp is rounded to a one-second interval.
    • "T" – the timestamp is the actual timestamp (default).
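
For illustration only (the timestamp value is made up, and the exact rounding
behavior is determined by TDSz; it is shown here as truncation), this is how
one SMF record timestamp would be stored under each setting:

/* Example SMF record timestamp: 2019-08-03-14.37.52.123456          */
/* def_timeint = "H"  ->  2019-08-03-14.00.00.000000                 */
/* def_timeint = "S"  ->  2019-08-03-14.37.52.000000                 */
/* def_timeint = "T"  ->  2019-08-03-14.37.52.123456  (as recorded)  */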

Install analytics components

As with any other component, you install analytics components using
Administration Dialog > Components (1.2).

  1. Select Reports.
  2. Select Components.
  3. Select the component(s).
  4. Press F6 to begin the installation.
  5. Select the desired parts of the component.
  6. Choose Batch.
  7. To start batch processing, enter Submit on the command line.

The analytics tables are created directly in Db2 Analytics Accelerator. After all analytics
components are installed, you can check their statuses on the Components
panel.

Screen capture showing the Components panel

The tables are also visible through TDSz panels. All analytics tables start
with A_.

Screen capture showing the TDSz panels
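
If you prefer SQL, a catalog query along these lines lists the analytics
tables (a sketch; 'DRL' is the usual TDSz table prefix, so adjust it to your
installation):

SELECT CREATOR, NAME
  FROM SYSIBM.SYSTABLES
 WHERE CREATOR = 'DRL'                 -- assumed TDSz prefix; adjust as needed
   AND NAME LIKE 'A!_%' ESCAPE '!';    -- analytics tables start with A_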

In addition, there are some sample reports that you can use through the
reporting dialog. All analytics sample report IDs start with A.

Screen capture showing the sample reports

Because analytics components are based on classic components, it should not
be difficult to copy any existing report that runs on a classic table and
convert it to run on an analytics table.

Customize LOOKUP definitions

Now, if necessary, you should customize the LOOKUP definitions of analytics
components to reflect the contents of classic lookup tables. Each
analytics component lookup table is based on a non-analytics component
lookup table. Customize lookup tables only if you are using Tivoli
Decision Support for z/OS to collect and populate the component tables on
Db2 for z/OS (in this scenario, you are not using TDSz collect), or if you
are using Tivoli Decision Support for z/OS reporting (you might be using
this).

You should customize members of the Common Data Provider if you are
collecting data only into the Db2 Analytics Accelerator and not having
the data reside on Db2 for z/OS. You will then need to configure the
lookup tables in the Common Data Provider to z/OS (in this scenario, you
are collecting data into Db2 Analytics Accelerator). It is best to copy the members from the
SHBODEFS data set to a newly created LOCAL.DEFS data set, as sketched below.
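
A minimal IEBCOPY sketch for making those copies (all data set names are
placeholders, and the two SELECT members are just examples; pick the members
from Table 1 that match your components):

//COPYDEFS EXEC PGM=IEBCOPY
//SYSPRINT DD  SYSOUT=*
//IN       DD  DISP=SHR,DSN=hlq_cdp.SHBODEFS
//OUT      DD  DISP=SHR,DSN=your.LOCAL.DEFS
//SYSIN    DD  *
  COPY OUTDD=OUT,INDD=IN
  SELECT MEMBER=(HBOTA2AP,HBOTALUG)
/*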

Table 1. CDPz Lookup table members

HBOvrm.SHBODEFS member name   Analytics component lookup table   Based on non-analytics component lookup table
HBOTA2AP                      A_DB2_APPLICATION                  DB2_APPLICATION
HBOTA2AC                      A_DB2_ACCUMAC                      DB2_ACCUMAC
HBOTALUG                      A_USER_GROUP                       USER_GROUP
HBOTALKP                      A_KPM_THRESHOLDS_L                 KPM_THRESHOLDS
HBOTALWL                      A_WORKLOAD_L                       MVS_WORKLOAD_TYPE
HBOTALW2                      A_WORKLOAD2_L                      MVS_WORKLOAD2_TYPE
HBOTALMI                      A_MIPS_L                           MVS_MIPS_T
HBOTALSP                      A_SYSPLEX_L                        MVS_SYSPLEX
HBOTALDA                      A_DEVICE_ADDR_L                    MVSPM_DEVICE_ADDR
HBOTALUT                      A_UNIT_TYPE_L                      MVSPM_UNIT_TYPE
HBOTALTR                      A_TIME_RES_L                       MVSPM_TIME_RES

COLLECT for IDAA_ONLY tables

Now you are ready to collect SMF data to Db2 Analytics Accelerator.

CDPz collect

First, run the JCL jobs provided with CDPz to generate Db2 internal format
data for the tables that the analytics components created for the
Accelerator. Copy the following jobs from SHBOCNTL to the LOCAL.CNTL data
set, then modify and submit them to create staging data sets from the
supplied SMF log data set:

  • HBOA2DUN – Analytics – Db2
  • HBOAKDUN – Analytics – KPM Db2
  • HBOAPMUN – Analytics – z/OS Performance
  • HBOAKZUN – Analytics – KPM z/OS (add SHBODEFS(HBOTPERI) if it is
    still missing)
  • HBOAKCUN – Analytics – KPM CICS

STEPLIB should contain the CDPz SHBOLOAD data
set.

Screen capture showing the contents of STEPLIB

HBOLOG should contain the SMF data set you want to collect.

Screen capture showing the contents of HBOLOG
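
Put together, the relevant DD statements in each HBO*UN job look roughly
like this (hlq_cdp and the SMF dump data set name are placeholders for your
installation's values):

//STEPLIB  DD  DISP=SHR,DSN=hlq_cdp.SHBOLOAD
//HBOLOG   DD  DISP=SHR,DSN=your.smf.dump.data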

After running the job, you can check statistics in the output:

Screen capture showing the output statistics

Db2 Analytics Accelerator load

The Db2 Analytics Accelerator Loader for z/OS V2.1 is used to load the Db2
internal format data sets, created by the System Data Engine (SDE),
directly into the Db2 Analytics Accelerator without the data residing in
Db2 for z/OS. Jobs are provided by CDPz.

Copy the following jobs from SDRLCNTL to the LOCAL.CNTL data set, then modify
and submit them to use the Loader to upload the staging data sets to the
Accelerator:

  • DRLJA2DD – Analytics – Db2
  • DRLJAKDD – Analytics – KPM Db2
  • DRLJAPMD – Analytics – z/OS Performance
  • DRLJAKZD – Analytics – KPM z/OS
  • DRLJAKCD – Analytics – KPM CICS

There are LOAD statements for each table in the component:

Screen capture showing the LOAD statements
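
Each statement pairs one staging data set with one analytics table and tells
the Loader to load it into the Accelerator only. The control statement below
is a sketch rather than a copy of the shipped members: the table name
A_EXAMPLE_T is hypothetical, HOLIDAA is the accelerator name from DRLFPROF,
and FORMAT INTERNAL / IDAA_ONLY ON are the options you would expect for Db2
internal format, Accelerator-only loads. Trust the generated DRLJ* jobs for
the exact syntax.

  LOAD DATA FORMAT INTERNAL
    IDAA_ONLY ON HOLIDAA
    INTO TABLE DRL.A_EXAMPLE_T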

Job output contains statistics about each statement:

Screen capture showing statistics about each statement

Combine COLLECT and LOAD

You can also combine the CDPz collect step and the Db2 Analytics Accelerator
load step into a single JCL job, along the lines of the following skeleton.
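
A skeleton only, assuming HBOPDE is the SDE batch program used by the HBO*UN
samples and that the load step is lifted unchanged from the matching DRLJ*D
member; every data set and program name below is a placeholder to be filled
in from those samples:

//* Step 1: CDPz System Data Engine collect (from the HBO*UN sample)
//COLLECT  EXEC PGM=HBOPDE,REGION=0M
//STEPLIB  DD  DISP=SHR,DSN=hlq_cdp.SHBOLOAD
//HBOLOG   DD  DISP=SHR,DSN=your.smf.dump.data
//HBOIN    DD  ...                SDE control statements from the sample
//*
//* Step 2: Accelerator Loader (from the DRLJ*D sample),
//* run only if the collect step ends with RC=0
//LOADIDAA EXEC PGM=...,COND=(0,NE)
//SYSIN    DD  ...                LOAD statements as generated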

TDSz collect

You should still use TDSz COLLECT for non-analytics components, but exclude
the A_* tables.

Screen capture showing the use of TDSz COLLECT for non-analytics components, excluding the A_* tables

Implementation not using Accelerator-only tables

As mentioned previously, there are a couple of ways you can implement
sending SMF data directly to Db2 Analytics Accelerator tables. This section explains how you
can do it in phases. When you install the analytics component in TDSz, the
tables are created first in Db2, and then you can move the tables into
Db2 Analytics Accelerator when ready. In the previous section (“Implementation using Accelerator-only tables”), when you
installed analytics components in TDSz, the tables were created directly
in Db2 Analytics Accelerator.

Phase 1: Run analytics with the tables in Db2

In Phase 1, you run with analytics component tables created in Db2. You can
even install analytics components if you don’t have a Db2 Analytics Accelerator or Common Data
Provider for z/OS. Later, when ready, you can enable the tables for
acceleration and move them to Db2 Analytics Accelerator.

Bind the Db2 plan used for Tivoli Decision Support

After applying the required PTF and running all the actions, including a
refresh of system tables, you need to rebind the Db2 plan used for Tivoli
Decision Support by specifying the BIND option
QUERYACCELERATION(ELIGIBLE) or
QUERYACCELERATION(ENABLE).

You can use the DRLJDBIN sample from the SDRLCNTL data set (the one used to
create the TDSz database).

//DBINIT   EXEC PGM=IKJEFT01
//STEPLIB  DD  DISP=SHR,DSN=hlq_db2.SDSNEXIT
//         DD  DISP=SHR,DSN=hlq_db2.SDSNLOAD
//DBRMLIB  DD  DISP=SHR,DSN=hlq_tds.SDRLCNTL(DRLPSQLX)
//SYSPRINT DD  SYSOUT=*
//SYSTSPRT DD  SYSOUT=*
//SYSTSIN  DD  *
 DSN SYSTEM(dsn)
   BIND PACKAGE(DRLPLAN) MEMBER(DRLPSQLX) -
     ACTION(REPLACE) ISOLATION(CS) ENCODING(EBCDIC) -
     QUERYACCELERATION(ELIGIBLE)
   BIND PLAN(DRLPLAN) PKLIST(*.DRLPLAN.*) -
     ACTION(REPLACE) RETAIN

   RUN PROGRAM(DSNTIAD) PLAN(DSNTIA12) -
     LIB('hlq_db2.RUNLIB.LOAD')
 END

Modify the TDSz profile data set

Now, in this scenario, you are not using AOT, so you don’t have to modify
DRLFPROF; you can just leave it with its
defaults:

/*---------------------------------------------------------------*/
/* IDAA Enablement                                               */
/*---------------------------------------------------------------*/
def_useaot      = "NO"                                            
def_accelerator = "HOLIDAA"                                        
def_timeint     = "T"
  • Change def_useaot to "YES" if you want to use accelerator-only
    tables. Leave it at "NO" if you want to decide later.
  • def_accelerator = "xxxxxxxx" (the name of the Accelerator).
  • def_timeint = "H" | "S" | "T"

    • "H" – the timestamp is rounded to an hourly interval.
    • "S" – the timestamp is rounded to a one-second interval.
    • "T" – the timestamp is the actual timestamp (default).

Customize the LOOKUP definitions

Now, if necessary, you should customize the LOOKUP definitions of analytics
components to reflect the contents of classic lookup tables. Each
analytics component lookup table is based on a non-analytics component
lookup table. If you are planning to run analytics components from Db2,
you can customize LOOKUP definition members by copying the appropriate
member from the SDRLDEFS data set to the LOCAL.DEFS data set. You can
decide to update LOOKUP tables after installing the appropriate
component.

You should customize lookup tables if you are using Tivoli Decision Support
for z/OS to collect and populate the component tables on Db2 for z/OS (in
this scenario, you might be using TDSz Collect). Also, you should
customize the lookup tables if you are using Tivoli Decision Support for
z/OS reporting (you might be doing that in this scenario).

Table 2. TDSz Lookup table members

TDSz member name   TDSz component table name   TDSz analytics component table name
DRLTA2AP           DB2_APPLICATION             A_DB2_APPLICATION
DRLTA2AC           DB2_ACCUMAC                 A_DB2_ACCUMAC
DRLTALUG           USER_GROUP                  A_USER_GROUP
DRLTALKP           KPM_THRESHOLDS              A_KPM_THRESHOLDS_L
DRLTALW2           MVS_WORKLOAD2_TYPE          A_WORKLOAD2_L
DRLTALDA           MVSPM_DEVICE_ADDR           A_DEVICE_ADDR_L
DRLTALUT           MVSPM_UNIT_TYPE             A_UNIT_TYPE_L
DRLTALMI           MVS_MIPS_T                  A_MIPS_L
DRLTALSP           MVS_SYSPLEX                 A_SYSPLEX_L
DRLTALWL           MVS_WORKLOAD_TYPE           A_WORKLOAD_L
DRLTALTR           MVSPM_TIME_RES              A_TIME_RES_L

Install analytics components

As with any other component, you install analytics components using
Administration Dialog > Components (1.2).

  1. Select Reports.
  2. Select Components.
  3. Select the component(s).
  4. Press F6 to begin the installation.

Now, you submit batch jobs, and the components are created. The tables are
created in Db2. All analytics tables start with A_.

Screen capture showing the tables created in Db2

Also, there are some sample reports that you can use through the reporting
dialog. All analytics sample report IDs start with A.

Screen capture showing the sample reports

Because analytics components are based on classic components, it should not
be difficult to copy any existing report that runs on a classic table and
convert it to run on an analytics table.

You can work with analytics components as with any other component at this
time.

Phase 2: Run analytics components with tables in Db2 Analytics Accelerator

In Phase 1, you installed analytics components with tables created in Db2;
they might already be enabled for acceleration. Now, in Phase 2, you are
ready to move the tables to Db2 Analytics Accelerator, so they become
accelerator-only tables (AOTs).
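
One way to verify the move afterward (a sketch; 'DRL' is again the assumed
TDSz prefix, and the columns come from the Db2 accelerator catalog table
SYSACCEL.SYSACCELERATEDTABLES) is to list which analytics tables the
accelerator knows about:

SELECT NAME, ACCELERATORNAME, ENABLE
  FROM SYSACCEL.SYSACCELERATEDTABLES
 WHERE CREATOR = 'DRL'                 -- assumed TDSz prefix; adjust as needed
   AND NAME LIKE 'A!_%' ESCAPE '!';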

Add the tables from analytics components to the Accelerator

Modify and submit members from SDRLCNTL to add the tables to the
Accelerator (note that this sample uses AQTSCALL compiled from
SAQTSAMP):

  • DRLJAKDA – Analytics – KPM Db2
  • DRLJA2DA – Analytics – Db2
  • DRLJAKZA – Analytics – KPM z/OS
  • DRLJAPMA – Analytics – z/OS
  • DRLJAKCA – Analytics – KPM CICS

Note: If you don’t have AQTSCALL compiled, you can use the
sample job AQTSJI03 in SAQTSAMP.
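
Under the covers, these jobs drive the accelerator stored procedures. If you
ever need to script the same step yourself, the call is roughly the following
(a sketch only: the table-set XML and the message parameter are simplified
placeholders, and the DRLJ*A and AQTSJI03 samples remain the reference):

  CALL SYSPROC.ACCEL_ADD_TABLES(
       'HOLIDAA',          -- accelerator name (from DRLFPROF)
       :table_set_xml,     -- XML document listing the A_* tables to add
       :message);          -- output message area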

Move the lookup table contents from analytics components to the Accelerator

Modify and submit members from SDRLCNTL to move the contents of the lookup
tables to the Accelerator:

  • DRLJA2DK – Analytics – Db2
  • DRLJAKZK – Analytics – KPM z/OS
  • DRLJAPMK – Analytics – z/OS

Customize LOOKUP definitions

Now, you should customize the LOOKUP definitions of analytics components to
reflect the contents of classic lookup tables, if necessary.

Each analytics component lookup table is based on a non-analytics component
lookup table.

You should customize members of the Common Data Provider if you are
collecting data only into the Db2 Analytics Accelerator and not having
the data reside on Db2 for z/OS. You will configure the lookup tables in
the Common Data Provider to z/OS. In this case, you will be collecting
data into Db2 Analytics Accelerator. It is best to copy the members from the
SHBODEFS data set to a newly created LOCAL.DEFS data set. See Table 1.

COLLECT for IDAA_ONLY tables

Note: This section is identical to “COLLECT for IDAA_ONLY tables” above. It
is repeated here for your convenience.

Now, you are ready to collect SMF data to Db2 Analytics Accelerator.

CDPz collect

First, run the JCL jobs provided with CDPz to generate Db2 internal format
data for the tables that the analytics components created for the
Accelerator. Copy the following jobs from SHBOCNTL to the LOCAL.CNTL data
set, then modify and submit them to create staging data sets from the
supplied SMF log data set:

  • HBOA2DUN – Analytics – Db2
  • HBOAKDUN – Analytics – KPM Db2
  • HBOAPMUN – Analytics – z/OS Performance
  • HBOAKZUN – Analytics – KPM z/OS
  • HBOAKCUN – Analytics – KPM CICS

STEPLIB should contain the CDPz SHBOLOAD data
set.

Screen capture showing the contents of STEPLIB

HBOLOG should contain the SMF data set that you want to collect.

Screen capture showing the contents of HBOLOG

After running the job, you can check statistics in the output:

Screen capture showing the output statistics

Db2 Analytics Accelerator load

The Db2 Analytics Accelerator Loader for z/OS V2.1 is used to load the Db2
internal format data sets, created by the SDE, directly into the Db2
Analytics Accelerator without the data residing in Db2 for z/OS. Jobs are
provided by CDPz.

Copy the following jobs from SDRLCNTL to the LOCAL.CNTL data set, then modify
and submit them to use the Loader to upload the staging data sets to the
Accelerator:

  • DRLJA2DD – Analytics – Db2
  • DRLJAKDD – Analytics – KPM Db2
  • DRLJAPMD – Analytics – z/OS Performance
  • DRLJAKZD – Analytics – KPM z/OS
  • DRLJAKCD – Analytics – KPM CICS

There are LOAD statements for each table in the component:

Screen capture showing the LOAD statements

Job output contains statistics about each statement:

Screen capture showing statistics for each statement

Combine COLLECT and LOAD

You can also combine the CDPz collect step and the Db2 Analytics Accelerator
load step into a single JCL job, as shown in the earlier skeleton.

TDSz collect

You should still use TDSz COLLECT for non-analytics components, but exclude
the A_* tables.

Screen capture showing the use of TDSz COLLECT for non-analytics components, excluding the A_* tables

Conclusion

This tutorial showed you how to integrate TDSz, CDPz, and Db2 Analytics
Accelerator to send SMF data directly to Db2 Analytics Accelerator using
accelerator-only tables. You explored two different approaches:

  • By creating tables using the TDSz schema directly in Db2 Analytics Accelerator when you
    install the analytics components
  • In phases, by creating tables first in Db2 as you do with
    non-analytics components of TDSz and then moving the tables to
    Db2 Analytics Accelerator

In both cases, you need Common Data Provider for z/OS to collect the SMF
data that is loaded into Db2 Analytics Accelerator.

