This documentation is not maintained. Please refer to doc.castsoftware.com/technologies to find the latest updates.

Summary: This document describes the extension that provides data column access support for the SQL Analyzer extension (version ≥ 2.4).

Extension ID

com.castsoftware.datacolumnaccess

What's new?

Please see Data Column Access - 2.1 - Release Notes for more information.

Description

The Data Column Access extension provides support for:

  • Data sensitivity policy based on Data Sensitivity indicators
  • Column access links with SQL Analyzer ≥ 2.4

In what situation should you install this extension?

  • If you need to check that the application complies with your data sensitivity policy, using the indicators
  • If you need to see Access Read and Access Write links for columns in SELECT, INSERT, UPDATE, DELETE and MERGE statements
  • If you need to see Access Write links between columns and Cobol data, from INSERT, UPDATE and SELECT ... INTO statements
  • If you need to see Access Write links between columns selected in a DECLARE cursor statement and the Cobol data fetched in the corresponding FETCH cursor statement

CAST AIP compatibility

This extension is compatible with:

CAST AIP release    Supported
8.3.x               (tick)
8.2.x               (tick)
8.1.x               (tick)
8.0.x               (tick)
7.3.x               (error)

Supported DBMS servers used for CAST AIP schemas

This extension is compatible with the following DBMS servers used to host CAST AIP schemas:

CAST AIP release          CSS       Oracle    Microsoft
All supported releases    (tick)    (tick)    (tick)

Supported client languages

Language           Supported?   Notes
Mainframe Cobol    (tick)
Visual Basic       (tick)
.NET               (tick)
Java/JEE           (tick)
C/C++              (tick)       Provided that queries are located in EXEC SQL statements and the com.castsoftware.cpp extension was used to analyze the client code, links to columns will be created if they exist.
Python             (tick)

Prerequisites

(tick) An installation of any compatible release of CAST AIP (see table above)
(tick) An installation of the SQL Analyzer extension (version ≥ 2.4)

Download and installation instructions

The latest release status of this extension can be seen when downloading it from the CAST Extend server.

What results can you expect?

  • Links are created for transaction and function point needs.
  • You can expect the following links on the DDL side, within the same SQL file:
    • accessRead from View / Procedure / Function / Trigger / Event to Column
    • accessWrite from Procedure / Function / Trigger / Event to Column
  • You can expect the same links on the client side, but only if the server-side code has been analyzed with the SQL Analyzer extension and the client analysis results have dependencies on the SQL Analyzer analysis results:

COBOL, PB, VB, .NET, JAVA, C/C++/Obj-C, iOS, PYTHON

  • When Cobol Data are created during the Cobol analysis and the production option Save data and links to other data is activated, you can expect the following links:
    • accessWrite from Cobol Data to Column, for INSERT and UPDATE. E.g. with Insert into table1 (col1, col2, ...) values (:data1, :data2), data1 and data2 will access write col1 and col2; with Update table1 set col1 = :data1, col2 = :data2, data1 and data2 will access write col1 and col2.
      • Sometimes these links have bookmarks and sometimes the bookmarks are missing; see Limitations for why. When the bookmark exists, it points to the Cobol Data definition and not to the SQL statement.
    • accessWrite from Column to Cobol Data, for SELECT ... INTO. E.g. with Select col1, col2, ... into :data1, :data2, col1 and col2 will access write data1 and data2.
      • These links have no bookmarks; see Limitations for why.
    • accessWrite from Column selected in a DECLARE cursor statement to Cobol Data fetched in the FETCH cursor statement. E.g. declare toto_curs cursor for select col_toto from toto, followed by fetch toto_curs into :var1: col_toto will access write var1.
      • These links have no bookmarks; see Limitations for why.

Special notes about links on the client side

  • For Java client-side code, SQL statements passed as parameters to methods covered by a SQL parametrization rule are analyzed.
Example of a call to a parametrized method
class Foo
{
    final static String TABLE_NAME = "Person";

    void method(java.sql.Statement statement) throws java.sql.SQLException
    {
        // The concatenated query string is resolved and analyzed because
        // execute() is covered by a SQL parametrization rule.
        String query = "select * from " + TABLE_NAME;
        statement.execute(query);
    }
}
  • But 'queries' visible in the DLM (that need reviewing) are not analyzed:
Example of a query visible in the DLM
class Foo
{
	// not passed to an execute something
	private final static String text = "select name from Person";
}
  • Explicit queries used in an ORM context are analyzed (or not) depending on whether they are visible in Enlighten

  • COBOL EXEC SQL queries are analyzed

  • SQL queries found in Python code are analyzed

  • SQL queries found in .properties files (Java Property Mapping objects) are analyzed
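For the Python case, here is a sketch of the kind of code in which the analyzer finds SQL queries: a literal SQL statement passed to a database driver call. The module, function and table names below are illustrative assumptions, not taken from the extension's documentation:

```python
import sqlite3

def load_person_names(connection: sqlite3.Connection) -> list:
    # The SELECT below reads the "name" column of the "Person" table,
    # so an accessRead link from the client code to that column is the
    # kind of link the extension is expected to create.
    query = "SELECT name FROM Person"
    return [row[0] for row in connection.execute(query)]

# Minimal in-memory database to exercise the function
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Person (name TEXT)")
conn.execute("INSERT INTO Person (name) VALUES ('Ada')")
print(load_person_names(conn))  # prints ['Ada']
```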

Examples

Select Into, Move, Update

The value in the column COL_TOTO is written into the Cobol data VAR1, the content of VAR1 is then moved into the Cobol data VAR2, and finally VAR2 updates the column COL_TITI. Here is the source code:

       IDENTIFICATION DIVISION.                                         00010000
       PROGRAM-ID.   TESTMOVE.                                          00020000
       DATA DIVISION.                                                   00340000
       WORKING-STORAGE SECTION.                                         00010000
       01 VAR1 PIC X.                                                   00010000
       01 VAR2 PIC X.                                                   00010000
                                                                        00010000
       PROCEDURE DIVISION.                                              00010000
                                                                        00010000
       EXEC SQL                                                         00010000
        SELECT COL_TOTO                                                 00010000
            INTO :VAR1                                                  00010000
        FROM TOTO                                                       00010000
       END-EXEC                                                         00010000
                                                                        00010000
       MOVE VAR1 TO VAR2                                                00010000
                                                                        00010000
       EXEC SQL                                                         00010000
       UPDATE TITI                                                      00010000
        SET COL_TITI = :VAR2                                            00010000
       END-EXEC                                                         00010000


(Figures: Insert Into, Update, Select Into)

Declare cursor ... Fetch cursor ... Move ... Insert

The value in the column COL_TOTO is selected in DECLARE TOTO_CURS, then written into the Cobol data VAR1 during FETCH TOTO_CURS. The content of VAR1 is moved into the Cobol data VAR2. Finally, VAR2 is inserted into the table TITI, column COL_TITI. Here is the source code:

       IDENTIFICATION DIVISION.                                         00010000
       PROGRAM-ID.   TESTCURSOR.                                        00020000
       DATA DIVISION.                                                   00340000
       WORKING-STORAGE SECTION.                                         00010000
       01 VAR1 PIC X.                                                   00010000
       01 VAR2 PIC X.                                                   00010000
                                                                        00010000
       PROCEDURE DIVISION.                                              00010000
                                                                        00010000
       EXEC SQL
        DECLARE TOTO_CURS CURSOR FOR                                    00010000
        SELECT COL_TOTO                                                 00010000
        FROM TOTO                                                       00010000
       END-EXEC                                                         00010000
                                                                        00010000
       EXEC SQL                                                         00010000
       FETCH TOTO_CURS                                                  00010000
        INTO :VAR1                                                      00010000
       END-EXEC                                                         00010000
                                                                        00010000
       MOVE VAR1 TO VAR2                                                00010000
                                                                        00010000
       EXEC SQL                                                         00010000
       INSERT INTO TITI (COL_TITI)                                      00010000
        VALUES (:VAR2)                                                  00010000
       END-EXEC                                                         00010000
                                                                        00010000
                                                                        00010000



Sensitivity Indicator

To enable the Sensitivity Indicator for columns, you must provide the analyzer with a configuration file (with the *.datasensitive extension), delivered in the same folder as the source code. The pattern to be used in the configuration file is: <database_name>.<schema_name>.<table_name>.<column_name>=<Sensitivity_indicator>.

The previous *.gdpr file extension is still accepted for compatibility reasons, but we recommend switching to the new one.
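As a sketch of the fully qualified four-part form of the pattern, where the database name is known (all database, schema, table and column names below are hypothetical):

```
SalesDb.Schema1.Customers.CreditCardNumber=Highly sensitive
SalesDb.Schema1.Customers.Phone=Very Sensitive
SalesDb.Schema1.Users.IsAdmin=Sensitive
```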

Data type

The indicator names depend on the extension version: datacolumnaccess >= 2.0.0-funcrel (with SQL Analyzer >= 3.5.3-funcrel) uses the names below, while datacolumnaccess <= 1.0.0-funcrel (with SQL Analyzer <= 3.5.2-funcrel) uses the former names shown in parentheses.

Highly Sensitive (formerly GDPR - Very sensitive)

The information stored in the column is very sensitive on its own, from a data sensitivity point of view, without being correlated with other information. Examples: credit card number, health insurance number, passport number.

Very Sensitive (formerly GDPR - Sensitive)

The information in the column is sensitive because, when correlated with other information, it becomes very sensitive. Examples: address, phone number.

Sensitive (formerly Security)

The information stored is critical for the security of the platform, such as the administrators list.

Not sensitive (formerly Not concerned)

The column is not involved in data sensitivity legislation.

The default value for each column is "Not sensitive" (the last one in the list above), so you only need to configure the columns concerned by the first three values.

Template

You can also find attached a real example for the AdventureWorks database:

Examples

Example
Schema1.Table1.Col1=Highly sensitive
Schema1.Table1.Col2=Very Sensitive
Schema1.Table1.Col3=Sensitive
......

When the Sensitivity Indicator should apply to the same column name from all tables, in a specific schema, use a "*" wildcard:

Example
Schema1.*.Age=Very sensitive

When the schema name is not known, but the Table and Column names are, e.g.:

Example
Table1.Col1=Highly sensitive

When the Sensitivity Indicator should apply to all columns named Age from all tables / schemas / databases, e.g.:

Example
*.Age=Very sensitive
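Taken together, the patterns above resolve a fully qualified column against the configured rules, falling back to the default "Not sensitive" when nothing matches. The following Python sketch illustrates this matching behavior; it is a simplified assumption of how the rules compose, not the extension's actual implementation, and the rule contents are hypothetical:

```python
from fnmatch import fnmatch

def sensitivity_for(column_path: str, rules: dict) -> str:
    """Return the indicator for a fully qualified column name, defaulting
    to "Not sensitive" when no rule matches. In this sketch the first
    matching rule wins; the real precedence is not specified here."""
    parts = column_path.split(".")
    for pattern, indicator in rules.items():
        pat_parts = pattern.split(".")
        if len(pat_parts) > len(parts):
            continue  # a pattern cannot be more qualified than the column
        # Compare trailing qualifiers so short patterns such as
        # "Table1.Col1" or "*.Age" match longer qualified names.
        tail = parts[-len(pat_parts):]
        if all(fnmatch(part, pat) for part, pat in zip(tail, pat_parts)):
            return indicator
    return "Not sensitive"

rules = {
    "Schema1.*.Age": "Very sensitive",
    "Table1.Col1": "Highly sensitive",
}
print(sensitivity_for("Schema1.Customers.Age", rules))  # Very sensitive
print(sensitivity_for("Schema1.Table1.Col1", rules))    # Highly sensitive
print(sensitivity_for("Schema1.Table1.Col9", rules))    # Not sensitive
```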

Performance

When using the Data Column Access extension 1.0.0, analysis execution time is slightly different (usually slightly longer). Here are some details of the performance we have experienced using AIP Core 8.3.24 and SQL Analyzer 3.4.4-funcrel:

Use case: 100% SQL (Teradata)

Technical details: 1 single SQL file, file size 14.5MB; detected variant: Ansisql; 129,529 LOC

Execution details: analysis duration for 1 analysis unit:

  • without Data Column Access extension = 20 min 20 sec
  • with Data Column Access extension = 21 min 26 sec

No data sensitivity. 988 access links are added.

Use case: Cobol + SQL (Db2)

Technical details: SQL: 1 single file, file size 289KB; detected variant: Db2. Cobol: 1,996 files, 11 folders, 52MB; 443.8K LOC

Execution details: analysis duration for 2 analysis units:

  • without Data Column Access extension = 19 min 19 sec
  • with Data Column Access extension = 19 min 27 sec

No data sensitivity. 15,245 access links are added, all of them on the Cobol side.

Use case: .NET + SQL (Microsoft SQL Server)

Technical details: SQL: 6 files (3 of the 6 are .sqltablesize), 1 folder, 4.08MB; detected variant: SQL Server. .NET: 14,062 files, 2,747 folders, 1.04GB. Assemblies: 218 files, 214 folders, 68.2MB; 698K LOC

Execution details: analysis duration for 185 analysis units:

  • without Data Column Access extension = 2 hours 9 min 34 sec
  • with Data Column Access extension = 2 hours 5 min 27 sec

No data sensitivity. From a total of 2,562 added access links, 6 are detected on the .NET side.