BDT Policy File
Refer to the Sample BDT Policy File for details.
Top-level Policy Attributes
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
id | No default | Mandatory | A unique identifier for the policy. |
name | No default | Mandatory | A unique name for the policy. |
description | No default | Optional | Description of the policy. |
caseSensitive | false | Optional | True or false. If set to true, all the database objects mentioned in the policy file will be processed in the same case as input. If set to false, all the database objects mentioned in the policy file will be automatically converted to CAPITAL case. Note: Setting the caseSensitive parameter to true allows SQL reserved words to be used as database objects in BDT. Refer to Sample BDT Policy File for details. |
trimSpaces | false | Optional | True or false. If set to true, initial and trailing white spaces are trimmed from the data present in the column to be transformed. If set to false, the flag is disabled, and no processing is done on the data present in the column to be transformed. |
enableHexLiteral | false | Optional | Input type: Boolean. If set to true, the ciphertext will be enclosed in a hex literal, for example, x'<ciphertext>'. |
source | No default | Mandatory | Defines details of data input. For databases, such details include connection parameters. For files, such details include file type (CSV_FILE or FIXED_LENGTH_FILE), delimiter, encoding, and so on. |
destination | No default | Mandatory (if inPlaceUpdate attribute is set to false ) | Defines details of data output, such as where to write transformed data. For databases, such details include connection parameters. If the policy file defines an in-place transformation of data (with inPlaceUpdate attribute set to true), the destination value is ignored if defined. For files, such details include file type (CSV_FILE or FIXED_LENGTH_FILE), delimiter, encoding, and so on. |
tables | No default | Mandatory | List of the tables to be transformed. |
fpemode | ASCII | Optional | ASCII or UTF (FPE only). Set fpemode to define which FF3-encoding mechanism to use. ASCII supports only ASCII characters and is faster than UTF. Default is ASCII. Set to UTF when using FPE to encrypt Unicode characters. Note: ASCII encryption mode is separate from ASCII character encoding. |
createBadRecordFile | false | Optional | True or false. Set to true to create a file with a .failed extension; otherwise, set to false. Note: It is recommended to delete the BadRecordsFile after use, as it contains sensitive information. |
inPlaceUpdate | false | Optional | Set to true to transform data in place within the database (that is, in-place transformation), rather than the default behavior that creates a new destination table containing the transformed data. If true, the destination, destinationTable, and createDestinationTable attributes are ignored. Note: • For an inPlaceUpdate transformation, a Primary Key column must be available in the table. If this column is not available, the following error message is displayed: “Primary key column not found. Can't perform in-place update.” • Ensure that you take a backup of the source table before performing an inPlaceUpdate. |
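As an illustration of how these top-level attributes fit together, the following is a minimal sketch of a policy file skeleton. The attribute names come from the table above; all values (the id, name, and the placeholder source, destination, and tables entries) are hypothetical and must be replaced with your own definitions, as described in the sections that follow.

```json
{
  "id": "policy-001",
  "name": "employee-data-policy",
  "description": "Illustrative policy skeleton (placeholder values only)",
  "caseSensitive": false,
  "trimSpaces": false,
  "inPlaceUpdate": false,
  "source": { "...": "see Source and Destination Attributes" },
  "destination": { "...": "see Source and Destination Attributes" },
  "tables": [ { "...": "see Tables Attributes" } ]
}
```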
Source and Destination Attributes
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
@type | No default | Mandatory | Specify the database type as one of DB2, ORACLE, MYSQL, SQL_SERVER, or HANA. |
connectionurl | No default | Mandatory | Database connection URL. For example: jdbc:mysql://localhost:3306/world. |
username | No default | Mandatory | Database username. |
password | No default | Mandatory | Database password. Use bdt -e option to add encrypted password. |
driverclass | No default | Mandatory | JDBC connection driver class. For example: com.mysql.cj.jdbc.Driver.
"source" :
{
"@type" : "HANA",
"connectionurl" : "jdbc:sap://host:port",
"username" : "",
"password" : "",
"driverclass" : "com.sap.db.jdbc.Driver"
},
"destination" :
{
"@type" : "HANA",
"connectionurl" : "jdbc:sap://host:port",
"username" : "",
"password" : "",
"driverclass" : "com.sap.db.jdbc.Driver"
}
Source and Destination Attributes for CSV Files
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
@type | No default | Mandatory | CSV_FILE. |
filepath | No default | Mandatory | The file path relative to the inputDir or outputDir path in the config file. |
delimiter | No default | Mandatory | The character that is used to divide one column from the next in the input file. Any printable ASCII character except for backslash (\) can be used as a delimiter, in addition to space ( ), tab, and Ctrl-A. It is mandatory to specify a delimiter in source. Note: The delimiter and qualifier characters cannot be the same. |
qualifier | No default | Optional | Character to enclose fields that contain a delimiter character. Any printable ASCII character except for backslash (\) can be used as a qualifier. This is used when the source is a file and the delimiter character itself appears in the data to be transformed. To tell BDT that such a character is not a delimiter, enclose the input data in the qualifier and specify that qualifier here. Note: The delimiter and qualifier characters cannot be the same. |
columnCount | No default | Mandatory | Number of columns in the input file. This does not necessarily equal the number of columns that have rules defined for them in this bdt.policy file. |
hasHeaderRow | false | Optional | True or false. Default false. If true, the utility will treat the first row as a header row. Values from first row will be treated as column names. The same column names should be mentioned in the transformation action. If false, the utility will treat all the rows as data to be transformed. |
unescapeInput | false | Optional | True or false. Default: false. If true, delimiter and qualifier characters in input preceded by a backslash (\) are considered as normal characters and part of the input data. If you are using the BDT output file as input to another process, set unescapeInput to true. |
encoding | UTF8 | Optional | Used to set the character encoding, for example, to enable IBM’s EBCDIC. The input file must use the same character encoding. The output and failed record files will be created with the same character encoding. Sample: For EBCDIC encoding, set encoding: "ibm500" in the policy file. To encrypt ASCII files, you must set encoding: ASCII. Refer to Supported Encodings for details. |
For file-to-file (CSV) transformation, if the delimiter or qualifier is not configured for the destination, the delimiter or qualifier configured in the source is used.
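The following is a minimal, illustrative sketch of a CSV file source definition, assuming a comma-delimited file with a header row. The file path and column count are hypothetical examples:

```json
"source": {
  "@type": "CSV_FILE",
  "filepath": "input/employees.csv",
  "delimiter": ",",
  "qualifier": "\"",
  "columnCount": 5,
  "hasHeaderRow": true,
  "encoding": "UTF8"
}
```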
Source and Destination Attributes for Fixed-Length Files
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
@type | No default | Mandatory | FIXED_LENGTH_FILE. |
filepath | No default | Mandatory | The file path relative to inputDir or outputDir path in config file. |
columnCount | No default | Mandatory | The number of columns a row has. |
encoding | No default | Optional | Used to set the character encoding, for example, to enable IBM’s EBCDIC. The input file must use the same character encoding. The output and failed record files will be created with the same character encoding. Sample: For EBCDIC encoding, set encoding: "ibm500" in the policy file. Refer to Supported Encodings for details. |
columnPositionInfo | No default | Mandatory | Column positioning. |
recordLength | No default | Mandatory | Record length. |
lineSeparator | false | Optional | True or false. Set to true if records are separated by a line separator. |
hasHeaderRow | false | Optional | True or false. Default false. If true, the utility will treat the first row as a header row. Values from this row will be treated as column names. The same column names should be mentioned in the transformation action. |
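The following is a minimal, illustrative sketch of a fixed-length file source definition. The file path, record length, and column count are hypothetical, and the columnPositionInfo value is left as a placeholder because its exact format is not covered in this table:

```json
"source": {
  "@type": "FIXED_LENGTH_FILE",
  "filepath": "input/employees.dat",
  "columnCount": 3,
  "recordLength": 60,
  "lineSeparator": true,
  "hasHeaderRow": false,
  "columnPositionInfo": "<column position definition>"
}
```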
Tables Attributes
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
sourceTable | No default | Mandatory | Source table name. For database only. Refer to Limitations for details. |
destinationTable | No default | Mandatory (if inPlaceUpdate attribute is set to false ) | Destination table name. For database only. If the policy file defines an in-place transformation of data (with inPlaceUpdate attribute set to true ), the destination value is ignored if defined. By definition, source and destination tables are the same. |
sourceSchema | Username specified in the JDBC connection, or the default schema name for databases that support schemas (for example, dbo for MSSQL) | Optional | Database username or schema (for databases that support schemas) to which the source table belongs. |
destinationSchema | Username specified in the JDBC connection, or the default schema name for databases that support schemas (for example, dbo for MSSQL) | Optional | Database username or schema (for databases that support schemas) to which the destination table belongs. |
columns | No default | Mandatory | List of the columns to be transformed. Policy rules for transforming one or more columns. For each column where transformation is desired, the bdt.policy file specifies the data transformation action to be performed and a set of parameters related to that action. |
subset | No default | Optional | Table-specific data filter criteria. Filter input data based on these criteria. |
createDestinationTable | false | Optional | True or false. Create the destination table if none exists. If in-place transformation is defined (when the inPlaceUpdate attribute is set to true), createDestinationTable is ignored, because no new destination table is created. For file-to-DB transformation, the datatype of columns in the destination table will be varchar(10000). Note: The createDestinationTable attribute is not supported for HANA databases. Caution: Use only for testing. Do not use in a production environment. Refer to Limitations for details. |
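The following is a minimal, illustrative sketch of a tables entry that transforms one table into a new destination table. The table and schema names are hypothetical, and the columns entry is covered under Columns Attributes:

```json
"tables": [
  {
    "sourceTable": "EMPLOYEES",
    "destinationTable": "EMPLOYEES_ENC",
    "sourceSchema": "HR",
    "destinationSchema": "HR",
    "createDestinationTable": false,
    "columns": [ { "...": "see Columns Attributes" } ]
  }
]
```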
Columns Attributes
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
name | No default | Mandatory | Column name. If hasHeaderRow is set to false, configure this parameter as "columnN", where N is the position of the column. Note: Column positions start from 0. For example, to transform the first column, the value is "column0"; for the second column, the value is "column1", and so on. |
targetColName | No default | Optional | Name of the new column. Refer to Storing Output of Crypto Operations in a New Column for details. |
srcColumnAction | preserve (if targetColName is set) | Optional | Following are the options: preserve: The source column remains preserved. remove: The source column is removed. empty: The source column is empty, that is, contains no data. Note: If srcColumnAction is set and no targetColName is present, BDT throws an error. |
targetColLength | No default | Mandatory (if targetColName is set during fixed file transformation) | The column length of targetColName . |
action | No default | Mandatory | For each column of input data, the bdt.policy file can specify any of these actions: ENCRYPT, DECRYPT, TOKENIZE, DETOKENIZE, and REKEY. Refer to Attributes for ENCRYPT/DECRYPT Actions and Attributes for TOKENIZE/DETOKENIZE Actions for details. Visit the following for examples of each action: • ENCRYPT • DECRYPT • TOKENIZE • DETOKENIZE • REKEY Note: If you do not specify any of these actions for a column, the contents of the column are copied to the output file without any transformation. |
config | No default | Mandatory | Configuration for Encryption or Tokenization operations. |
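The following is a minimal, illustrative sketch of a column definition that encrypts one column into a new target column. The column names are hypothetical, and the config block is described under Configuration Attributes:

```json
"columns": [
  {
    "name": "SSN",
    "targetColName": "SSN_ENC",
    "srcColumnAction": "preserve",
    "action": "ENCRYPT",
    "config": { "...": "see Configuration Attributes" }
  }
]
```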
Configuration Attributes
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
@type | No default | Mandatory | Options: AES_CBC_PAD, AES_CTR, FPE, FF1, FPE_AES, and DESede (signifying the triple DES operations of encrypt-decrypt-encrypt). To encrypt/decrypt data using either of the NIST-defined FPE formats, FF3 or FF1, the user must specify the ENCRYPT or DECRYPT policy action with the corresponding @type value. The value for FF1 is FF1, and for FF3 it is FPE. Note: The BDT syntax uses FPE as a synonym for FF3. |
key | No default | Mandatory | Name of the encryption key. |
characterSet | No default | Mandatory (if @type is FPE or FF1 or FPE_AES) | Name of the characterSet that was defined under characterSets in the bdt.config file. Defaults include "latin", "digits", "alphabets", and "alphanumeric". Users can also define their own characterSets in the bdt.config file. Note: characterSets are not applicable if @type is AES_CBC_PAD or AES_CTR or DESede. |
tweak | No default | Mandatory (if @type is FPE or FF1 or FPE_AES) | Uses the tweakable cipher concept to protect against statistical attacks due to a potentially small input/output space. There are two ways of defining tweak: "tweak":<16 character long HEX encoded string> or "tweak":"auto". Either of these ways can be used. Refer to the tweakSource attribute for details. Note: tweak is not applicable if @type is AES_CBC_PAD or AES_CTR or DESede. |
tweakSource | No default | Optional | Specifies a column in the source for the tweak, to allow different tweaks to be used for each row. For example: "tweakSource":"columnName". tweakSource is not supported when inPlaceUpdate is set to true. |
allowSmallInputs | false | Optional | True or false. Default is false. Applicable to FPE, FF1, and FPE_AES. If true, the following scenarios occur: • Data is passed through untransformed if the effective data length is less than 2 characters for FPE and FF1, or less than 4 characters for FPE_AES. • Transformation continues if the effective data length is greater than 2 characters for FPE and FF1, or greater than 4 characters for FPE_AES. If false, transformation fails if the effective data length is less than 2 characters for FPE and FF1, or less than 4 characters for FPE_AES. Note: • allowSingleCharInputs is not applicable to DPG initial encryption as DPG doesn't support single characters. • To provide backward compatibility, allowSingleCharInputs is also supported, but it is recommended to use allowSmallInputs. |
keepleft | No default | Optional | Number of characters to be preserved from left. It is applicable only if @type is FPE or FF1 or FPE_AES . |
keepright | No default | Optional | Number of characters to be preserved from right. It is applicable only if @type is FPE or FF1 or FPE_AES . |
policyVersion | 0 | Optional | Specifies the protection policy version information. If policy version > 0, then the version header in hex will be prepended to the ciphertext. If not, then no policy version header will be prepended to the ciphertext. Note: Applicable only for DPG initial encryption. |
iv | No default | Mandatory (if @type is AES_CBC_PAD or AES_CTR or DESede with mode as CBC) | Initialization Vector. Refer to Supported Encryption Algorithms for details. Note: • It is recommended to specify a randomly generated IV and not to reuse any IV. • IV is not applicable if @type is FPE or FF1 or DESede with mode as ECB. |
ivSource | No default | Optional | Specifies a column in the source for the IV, to allow different IVs to be used for each row. For example: "ivSource":"columnName". The value is to be specified in hexadecimal format, for example: 01A40933B401A409. If the value is not valid hexadecimal, it is treated as an ASCII string. ivSource is not supported when inPlaceUpdate is set to true. Note: It is recommended to specify a randomly generated IV and not to reuse any IV. |
inputEncoding | No default | Optional | Specifies how to decode input and create a byte array from it. Valid options are BASE2, BASE16, BASE64, UTF8, UTF16LE, UTF16BE, UTF32LE, or UTF32BE. Note: • BASE2 is supported only with database Blob data type columns. • inputEncoding is not used in any of the FPE algorithms. |
outputEncoding | No default | Optional | Specifies how to encode the output byte array to a string. Valid options are BASE16, BASE64, UTF8, UTF16LE, UTF16BE, UTF32LE, or UTF32BE. Note: outputEncoding is not used in any of the FPE algorithms. |
mode | No default | Mandatory (if @type is DESede ) | Specifies the supported algorithm mode, either CBC or ECB . The mode must be specified when using the DESede (that is, the 3DES) algorithm (as specified by the @type attribute).DESede means DES encrypt-decrypt-encrypt, a common synonym for 3DES that summarizes the basic three-step process underlying it. Note: ECB mode is not recommended. |
padding | No default | Mandatory (if @type is DESede) | Valid options are PKCS5Padding or NoPadding. Note: padding is supported only if @type is DESede. Refer to Supported Encryption Algorithms for details. |
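The following is a minimal, illustrative sketch of a config block for an FPE (FF3) ENCRYPT action, assuming a key available to BDT and a characterSet defined in the bdt.config file. The key name, character set, tweak value, and keepleft/keepright settings are hypothetical:

```json
"config": {
  "@type": "FPE",
  "key": "bdt-fpe-key",
  "characterSet": "digits",
  "tweak": "0123456789ABCDEF",
  "keepleft": 0,
  "keepright": 4
}
```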
Attributes for TOKENIZE/DETOKENIZE Actions
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
@type | No default | Mandatory | Valid value is TOKENIZE . |
tokenGroup | No Default | Mandatory (if action is TOKENIZE or DETOKENIZE ) | Name of the token group as defined in VTS. |
tokenTemplate | No Default | Mandatory (if action is TOKENIZE or DETOKENIZE ) | Name of the token template as defined in VTS. |
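The following is a minimal, illustrative sketch of a config block for a TOKENIZE action, assuming a token group and token template already defined in VTS. Both names are hypothetical:

```json
"config": {
  "@type": "TOKENIZE",
  "tokenGroup": "customer_tokens",
  "tokenTemplate": "ssn_template"
}
```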
Attributes to Transform a Subset of Records
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
offset | 0 | Optional | Start record index. |
limit | No default | Mandatory (if filtering data for transformation) | Maximum number of records/rows to transform, starting from offset . |
recurrency | 1 | Optional | Select every Nth record. Default is 1 (process every record). |
orderBy | First column | Optional | Order by column, for example: column1 . Not supported for file transformation. |
order | ASC | Optional | ASC (ascending) or DESC (descending) record order. Not supported for file transformation. |
filters | No default | Mandatory (if filtering data for transformation) | Subset-specific filters to be used. |
Attributes to Filter a Subset
Attribute | Default Value | Mandatory/Optional | Description |
---|---|---|---|
selector | No default | Mandatory (if filtering data for transformation) | Column name. |
operator | No default | Mandatory (if filtering data for transformation) | Options: "=", "<", ">", "<=", ">=", "!=", "like". |
expression | No default | Mandatory (if filtering data for transformation) | Expression to evaluate filter on selector using specified operator. |
The following is a sample policy file excerpt for table attributes (Subset, Offset, Limit, Filters, Selector, Operator, Expression) meant for filtering input data:
"subset":{
"offset":"10",
"limit":1000,
"filters":[
{
"selector":"salary",
"operator":">",
"expression":"10000"
},
{
"selector":"name",
"operator":"like",
"expression":"'p%'"
}
]
}