
Pentaho (Community Edition)

I. Objectives

This course focuses on using Pentaho components to perform data transformation (ETL) processes, model data sources, and visualize them through classic reports, on-demand reports and dashboards.

At the end of the course, the student will have sufficient knowledge to:

Install the product and the corresponding Java version.

Perform data transformation (ETL) from different data sources to different destinations.

Model a dimensional data source as a star schema, with snowflake normalization where needed.

Create online and classic (pixel-perfect) reports.

Use parameters to create user-configurable reports.

Create dashboards that access different data sources.

II. Requirements

SQL knowledge.

Optional:

Knowledge of other ETL tools.

III. Duration

25 hours.

IV. Methodology

The course combines theoretical presentations with practical demonstrations and explanations of the results obtained.

The student creates objects (PDI transformations and jobs, PRD reports, PSW schemas, online reports with Saiku or other tools, dashboards with CDE) against different data sources, mainly relational (MySQL).

Questions about the concepts presented are resolved.

V. Content

Introduction to Pentaho Community Edition and its components. PCE components. Java requirements.

Installation of components.

PDI. Pentaho Data Integration:
Connections to databases.
Importing and exporting using tables.
Importing and exporting from plain text files: CSV, Excel, XML, etc.
Using calculator and formulas.
Using lookup, group by, split, pivot.
Modifying Kettle properties.
Creating jobs.
Using variables.
Using parameters.
Job flow and error management.

Using PDI from the terminal.
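PDI ships command-line launchers: Pan runs .ktr transformations and Kitchen runs .kjb jobs. As a rough sketch, the argument list for a Pan invocation can be composed like this (the install path, file names and parameter below are hypothetical, not from the course):

```python
# Sketch: composing the command line to run a PDI transformation with Pan.
# The install path and the .ktr file are invented for illustration.
import shlex

def pan_command(ktr_path, params=None, level="Basic",
                pdi_home="/opt/data-integration"):
    """Build the argument list for running a .ktr transformation with Pan."""
    args = [f"{pdi_home}/pan.sh", f"-file={ktr_path}", f"-level={level}"]
    for name, value in (params or {}).items():
        # Named parameters are passed to Pan as -param:NAME=value.
        args.append(f"-param:{name}={value}")
    return args

cmd = pan_command("/etl/load_sales.ktr", {"START_DATE": "2024-01-01"})
print(shlex.join(cmd))
```

Kitchen is invoked analogously (kitchen.sh with a -file=... pointing at a .kjb job); its exit code reports success or failure, which makes both tools easy to schedule with cron.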

PSW. Pentaho Schema Workbench:
Modeling a dimensional schema (star model) with normalization (snowflake).
Publishing schemas.
Defining the fact table.
Adding dimensions.
Creating hierarchies.
Modeling snowflakes.
Creating calculated measures.
Creating calculated members.

Using aggregate (summary) tables.
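The star-with-snowflake layout modeled in PSW can be sketched in SQL. A minimal, illustrative example using Python's built-in SQLite (the table names and data are invented, not from the course): the fact table joins a plain star dimension directly, while the product dimension is snowflaked into a separate category table.

```python
# Minimal star schema with one snowflaked dimension, in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Fact table: one row per sale, measures plus foreign keys.
CREATE TABLE fact_sales (date_id INTEGER, product_id INTEGER,
                         units INTEGER, amount REAL);
-- Star dimension, joined directly to the fact table.
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
-- Snowflaked dimension: product is normalized into a category table.
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT,
                          category_id INTEGER);
CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, category TEXT);

INSERT INTO dim_date VALUES (1, 2024, 1), (2, 2024, 2);
INSERT INTO dim_category VALUES (10, 'Books'), (20, 'Music');
INSERT INTO dim_product VALUES (100, 'Novel', 10), (200, 'Album', 20);
INSERT INTO fact_sales VALUES (1, 100, 3, 30.0), (1, 200, 1, 15.0),
                              (2, 100, 2, 20.0);
""")

# Aggregating by category walks the snowflaked chain fact -> product -> category.
rows = cur.execute("""
    SELECT c.category, SUM(f.units)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_category c ON c.category_id = p.category_id
    GROUP BY c.category
    ORDER BY c.category
""").fetchall()
print(rows)  # [('Books', 5), ('Music', 1)]
```

An aggregate (summary) table would precompute exactly this kind of GROUP BY result so Mondrian can answer the query without scanning the fact table.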

BIS. Pentaho BI Server:
Configuring and starting the server.
Server logs.
Accessing the system.
Configuring users.

On-demand reports.

JPivot.

Pivot4J.
Saiku.
Downloading and uploading files.

Scheduler.

PRD. Pentaho Report Designer:
Creating classic (pixel-perfect) reports.
Using headers, footers, document setup, the label area and the data area.
Pagination; including the date and time in a report.
Formatting columns.
Creating groupings.
Creating totals.
Using user filters:
Simple filters.
Filters with text strings.
Key-value filters.
Nested parameters.
Default values.
Options for "no data found".
Subreports in reports.
Charts in reports:
Pie chart.
Bar chart.
Subreports with charts.
Sparkline charts.

Pivot tables in reports.

CDE. Community Dashboard Editor:
Dashboard concept.
Introduction to the three layers of CDE.
Page layout.
Components.
Data sources.
Creating CCC charts.

Incorporating charts and other data visualization elements into the dashboard.

Oracle Analytics Server: Analytics and Dashboards

I. Objectives

This course is focused on creating reports (analyses) and dashboards in Oracle Analytics Server.

Upon completion of the course, the student will have sufficient knowledge to:

Create reports with tables and charts, using user-defined and dynamic filters, groupings, and advanced selections with hierarchy members.

Create different types of graphs and configure their properties. Add them to reports and establish master-detail links.

Use key performance indicators (KPIs) and follow their trends across dimensions.

Create dashboards with dynamic filters using prompts, conditioning the display of objects.

Use the delivery tools: Delivers, for automatic and scheduled report delivery, and briefing books.

Use the new Oracle Analytics Server data visualization tools.

II. Requirements

There are no specific requirements.

Optional:

Knowledge of SQL and data analysis.

III. Duration

30 hours.

IV. Methodology

The course combines theoretical presentations with practical demonstrations and explanations of the results obtained.

The student creates analyses using different product features of varying complexity. A dashboard is then developed from these analyses.

Questions about the concepts presented are resolved.

V. Content

Introduction to Oracle Analytics Server and Oracle Analytics Cloud.

Architecture.

Data modeling. Create and publish a model. Create fact and dimension tables. Create hierarchies.

Additional data modeling techniques.

User interface. Subject areas and relationships with the repository. Main page, favorites and catalog.

Introduction to the analysis editor.

Creating views:
Creating analyses.
Sorting data.
Views in analyses.
Editing charts.

Creating calculated items

Formatting:
Formatting columns.
Conditional formatting.

Filters and prompts:
Data filters.
Using prompts and variables.

Grouping and selection steps:
Using selection steps.
Creating and using groups.
Creating and using calculated items.

Advanced use of the analysis editor:
View editors.
Compound Layout panel features.
Using sections (page, section, measures, rows, columns, exclusion).

Chart editor:
Defining general chart properties.
Modifying a chart's style.
Using titles and labels.
Conditional formatting.
Adding totals and subtotals.
Using bins (combining values).
Using the master-detail function.

Graphical objects:
Advanced options.
Using KPIs.
Additional graphical objects.

Dashboards:
Interactive analytics.
Creating dashboards.
Expressions.
Adding subpages.
Adding dashboard actions.
Using prompts.
Personalization.
Adding content.

Briefing books:
Creating briefing books.
Using briefing books.

Delivers:
Creating automatic content with Delivers.
Creating agents.

Cache management.

Oracle Data Visualization:
Projects and data.
Creating visualizations and stories.

Pentaho Data Integration – Kettle (CE)

I. Objectives

This course is focused on the use of the Pentaho ETL component: Pentaho Data Integration.

Upon completion of the course, the student will have sufficient knowledge to:

Install the product and the corresponding Java version.

Perform data transformation (ETL) from different data sources to different destinations.

II. Requirements

SQL knowledge.

Optional:

Knowledge of other ETL tools.

III. Duration

20 hours.

IV. Methodology

The course combines theoretical presentations with practical demonstrations and explanations of the results obtained.

The student practices with the product after each concept is explained. Different data sources are used, including text files, spreadsheets and databases, mainly relational (MySQL).

Questions about the concepts presented are resolved.

V. Content

Introduction to Pentaho Community Edition and its components. PCE components. Java requirements.

Java and PDI installation.

PDI. Pentaho Data Integration.
Databases:
Connections to databases.
Using shared connections.
Basic components of transformations:
Creating transformations.
Importing and exporting using tables.
Importing and exporting from plain text files: CSV, Excel, XML, etc.
Using calculator and formulas.
Selecting columns and filtering data.
Using lookup, group by, split, pivot.
Merge join.
Mapping.
Jobs, variables and properties:
Modifying Kettle properties.
Creating jobs.
Using variables.
Using parameters.
Job flow and error management.
Additional elements:
Using PDI from the terminal.
Running code in the database.
Dynamic processing of files.
Moving files in the operating system.
Writing to the PDI log.
Waiting for files (filewatcher).
Checking for file existence.
Checking for table existence (in the database).

Practices with transformations and jobs.
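Two of the transformation steps listed above, stream lookup and group by, can be sketched in plain Python. In real PDI these are configured steps in Spoon rather than code, and the field names and data here are invented for illustration:

```python
# Sketch of two PDI transformation steps: a stream lookup (enrich incoming
# rows from a second source) and a group-by (aggregate per key).
from collections import defaultdict

sales = [
    {"product_id": 100, "units": 3},
    {"product_id": 200, "units": 1},
    {"product_id": 100, "units": 2},
]
# Lookup source, keyed by product_id (like the step's lookup stream).
products = {100: "Novel", 200: "Album"}

# Stream lookup step: add the product name to each incoming row.
for row in sales:
    row["name"] = products.get(row["product_id"], "<unknown>")

# Group-by step: sum units per product name.
totals = defaultdict(int)
for row in sales:
    totals[row["name"]] += row["units"]

print(dict(totals))  # {'Novel': 5, 'Album': 1}
```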

QlikView: Designer

I. Objectives

This course focuses on the basic analysis part of the QlikView tool.

It provides the basis for understanding the philosophy of QlikView's associative model and the objects used to access data and display it in different formats.

At the end of the course, the student will have sufficient knowledge to:

Access different types of data (varying in both file format and structure).

Create tables and charts for analyzing information (bar, line, pie charts, etc.).

Understand the associative model linking QlikView elements.

Customize the display of analysis objects.

II. Requirements

No specific requirements are needed beyond having used other simple analysis tools (Microsoft Excel, for example).

Knowledge of SQL and the use of plain text files can facilitate understanding of the subject.

III. Duration

15 hours.

IV. Methodology

The course combines theoretical presentations with practical demonstrations and explanations of the results obtained.

The student models different data sources to test the product's features, and designs analysis sheets and dashboards by inserting selection elements and charts.

Questions about the concepts presented are resolved.

V. Content

Introduction to QlikView and data analysis:
Basic BI concepts.
Data models.

Main features of QlikView:
Data modeling. The associative model.
Importing data from flat files.
Importing data from spreadsheets.
Importing data from databases: ODBC, OLE DB.
Data loading scripts. The LOAD command.
Using inline tables.
Using star models.
Using snowflakes.
Using normalized models.
Table viewer window.
Expression editor window.
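The core of the associative model is that tables are linked by shared field names, and a selection on one field narrows the set of possible values everywhere else. A rough Python sketch of that idea (not QlikView script; the table and data are invented):

```python
# Sketch of QlikView-style associative selection: given the current
# selections, which values of a field are still reachable?
orders = [
    {"order_id": 1, "customer": "Ana", "product": "Novel"},
    {"order_id": 2, "customer": "Ana", "product": "Album"},
    {"order_id": 3, "customer": "Luis", "product": "Novel"},
]

def possible_values(rows, field, selections):
    """Distinct values of `field` compatible with every current selection."""
    return sorted({row[field] for row in rows
                   if all(row[k] == v for k, v in selections.items())})

# Selecting customer = 'Luis' leaves only the products Luis bought:
print(possible_values(orders, "product", {"customer": "Luis"}))  # ['Novel']
# With no selection, every product remains possible:
print(possible_values(orders, "product", {}))  # ['Album', 'Novel']
```

In QlikView this is what drives the green/white/grey coloring of list boxes: selected, possible, and excluded values respectively.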

Creating content:
Document setup.
Sheet properties.
General characteristics of an object.
Using cyclic expressions.
Object configuration:
Titles.
List boxes.
Table boxes.
Bookmarks.
Multiple selection boxes.
Current selections box.
Inserting images.
Creating a "Top 10" object.
Creating analysis sheets and dashboards.
Linking and copying objects.
Indicator (gauge) charts.
Charts:
Lines.
Pie.
Bars.
Radar.
KPIs.
Using expressions in charts.
Hierarchical charts.
Using bookmarks.

Advanced buttons and objects.

Tableau

I. Objectives

This course is focused on the analysis part of the Tableau tool.

It provides the necessary knowledge to use the data access and analysis tools.

At the end of the course, the student will have sufficient knowledge to:

Access data that resides in different sources.

Create analyses using Tableau.

II. Requirements

No specific requirements are needed beyond having used other simple analysis tools (Microsoft Excel, for example).

Knowledge of SQL and the use of plain text files can facilitate understanding of the subject.

III. Duration

20 hours.

IV. Methodology

The course combines theoretical presentations with practical demonstrations and explanations of the results obtained.

The student completes practical exercises, modeling different data sources to explore the product's features and using its data analysis functions.

Questions about the concepts presented are resolved.

V. Content

Introduction to Business Intelligence, Data Warehouse and Tableau. Basic concepts of BI. Data models. Main characteristics of Tableau.

Data access and preparation.

Use of plain text files.

Pivoting datasets.
Fixed-length plain text files.
Plain text files with a field separator character.
Using Microsoft Excel as a data source.
Joining data using Microsoft Excel.
Using Google Sheets.
Basic use of geographic information.
Complementing data using extracts.
LEFT / RIGHT / FULL data joins.
CROSS data joins.
BLENDING data joins.
Data analytics:
Data tables.
Context filters.
Sorting and hierarchical sorting.
Charts:
Basic.
Bullet chart.
Bar-in-bar.
Heat map.
Other charts.
Calculated elements:
Row-level calculations.
Grouped calculations.
Null values in grouped calculations.
Dimensional-level calculations.
Parameters.
Trend lines.
Clustering.
Forecasting.
Dashboards:
Creating the dashboard.
Using visualizations.
Inserting images.
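The dimensional-level calculations above correspond to Tableau's level-of-detail (LOD) expressions. A rough Python sketch of a FIXED-style calculation, where an aggregate computed per category is attached back to every row (the data and field names are invented):

```python
# Sketch of a Tableau {FIXED [category] : SUM([sales])} calculation:
# aggregate at the category level, then attach the result to each row.
rows = [
    {"category": "Books", "sales": 30.0},
    {"category": "Books", "sales": 20.0},
    {"category": "Music", "sales": 15.0},
]

# Category-level aggregate, independent of the row-level view.
fixed = {}
for r in rows:
    fixed[r["category"]] = fixed.get(r["category"], 0.0) + r["sales"]

# Attach the fixed aggregate to each row, like a calculated field,
# e.g. to compute each row's share of its category total.
for r in rows:
    r["category_total"] = fixed[r["category"]]
    r["share"] = r["sales"] / r["category_total"]

print(rows[0])
```

This is what lets a dashboard show, per row, a percentage of a total computed at a coarser level than the view itself.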

Story:
Creating a story.
Adding elements to each sheet.
Adding comments to each sheet.

© Juan de Juan.