Creating New Token Formats
This topic explains how to create your own token formats. The createNewFormat() method in the TokenService class and the CreateNewTokenFormat call in the web service enable you to create a new token format when the predefined formats do not meet your needs. The Java method is overloaded, so you have several options for creating new token formats with the Java API.
Tip
For useful background information on token formats, refer to Token Formats.
Note
You can use an obfuscated password or credential. See Creating Obfuscated Data Using Obfuscation Utility for more information.
Create Custom Token Formats with Java Regular Expressions
The regular-expression overload of the createNewFormat API method is:
public int createNewFormat (String description, String splitter, String splicer);
This method enables Token Administrators to create their own custom token formats using regular expressions, and it works with any supported data type.
Parameters
String description: User-provided information for the custom token format. This can be the name of the new format or any other related details.
String splitter: A regular expression that splits the input plaintext into groups.
String splicer: A set of directives that combines the groups into the resulting token.
Examples for defining splitter:
String FirstTwoLastFourSplitter = "(.{2})(.*)(.{4})";
// Here, FirstTwoLastFourSplitter splits the input data into three groups:
// the first group consists of the first two characters of the input data,
// the second group consists of the remaining middle characters, and
// the third group consists of the last four characters.
String LastFourSplitter = "(.*)(.{4})";
// Here LastFourSplitter splits the input data into two groups. (.{4}) group has the last four characters of the input data and (.*) group has the remaining characters.
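Because the splitter is a plain Java regular expression, you can verify how a pattern groups your input before handing it to createNewFormat(). The following standalone sketch uses java.util.regex directly; CT-V performs the equivalent grouping internally.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SplitterDemo {
    // Apply a CT-V style splitter pattern and return its capture groups.
    static String[] splitGroups(String pattern, String input) {
        Matcher m = Pattern.compile(pattern).matcher(input);
        if (!m.matches()) {
            throw new IllegalArgumentException("input does not match splitter");
        }
        String[] groups = new String[m.groupCount()];
        for (int i = 0; i < groups.length; i++) {
            groups[i] = m.group(i + 1);
        }
        return groups;
    }

    public static void main(String[] args) {
        String[] g = splitGroups("(.{2})(.*)(.{4})", "1234-5678-9012-345");
        System.out.println("Group 1: " + g[0]); // 12
        System.out.println("Group 2: " + g[1]); // 34-5678-9012
        System.out.println("Group 3: " + g[2]); // -345
    }
}
```

Note that the greedy `(.*)` backtracks just enough to leave exactly four characters for the trailing `(.{4})` group, because matches() requires the pattern to cover the entire input.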
Examples for defining splicer:
String splicer1 = "(0 GLOBAL LENGTH=20 LC=PASS)(1 KEEP)(2 RANDOM)(3 KEEP)";
//LENGTH defines the length of the token and
//LC is used to pass/fail Luhn check on token. Possible values are PASS, FAIL or NONE. LENGTH and LC are optional.
String splicer2 = "(1 RANDOM)(2 KEEP)";
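To picture what the splicer directives do, here is a small standalone simulation. It is not CT-V code: it only mimics KEEP (groups 1 and 3) and RANDOM (group 2, with digits randomized and non-digits preserved) to show the shape of the resulting token.

```java
import java.util.Random;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SplicerSketch {
    // Hypothetical stand-in for CT-V's splicer logic: keep groups 1 and 3,
    // replace each digit in group 2 with a random digit (non-digits kept).
    static String spliceFirstTwoLastFour(String input, Random rng) {
        Matcher m = Pattern.compile("(.{2})(.*)(.{4})").matcher(input);
        if (!m.matches()) {
            throw new IllegalArgumentException("input too short for this splitter");
        }
        StringBuilder middle = new StringBuilder();
        for (char c : m.group(2).toCharArray()) {
            middle.append(Character.isDigit(c) ? (char) ('0' + rng.nextInt(10)) : c);
        }
        return m.group(1) + middle + m.group(3);
    }

    public static void main(String[] args) {
        String input = "1234-5678-9012-345";
        String token = spliceFirstTwoLastFour(input, new Random());
        // The token keeps "12" at the front and "-345" at the end,
        // with random digits (and the original dashes) in between.
        System.out.println(token);
    }
}
```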
Examples for creating custom token format:
int rf1 = _ts.createNewFormat("FirstTwoLastFour", FirstTwoLastFourSplitter, splicer1);
int rf2 = _ts.createNewFormat("LastFour", LastFourSplitter, splicer2);
Generating Token Values
Example:
String input = "1234-5678-9012-345";
String token = _ts.insert(input, _dbTable, rf1, false);
System.out.println("INPUT VALUE: " + input);
System.out.println("REGEX TOKEN: " + token);
The output is:
INPUT VALUE: 1234-5678-9012-345
REGEX TOKEN: 1269-9901-693947-345
Note
If the resulting token values are 10 characters or longer, you have the option to perform a Luhn check on them.
New formats are stored in the SFNT_TOKEN_FORMAT table in the database.
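LC=PASS and LC=FAIL refer to the standard Luhn checksum. For reference, here is a minimal standalone implementation of the check (not CT-V code):

```java
public class LuhnCheck {
    // Standard Luhn algorithm: double every second digit from the right,
    // subtract 9 when the doubled value exceeds 9, and require the sum
    // to be divisible by 10.
    static boolean luhnValid(String digits) {
        int sum = 0;
        boolean doubleIt = false;
        for (int i = digits.length() - 1; i >= 0; i--) {
            int d = digits.charAt(i) - '0';
            if (doubleIt) {
                d *= 2;
                if (d > 9) d -= 9;
            }
            sum += d;
            doubleIt = !doubleIt;
        }
        return sum % 10 == 0;
    }

    public static void main(String[] args) {
        System.out.println(luhnValid("79927398713")); // true
        System.out.println(luhnValid("79927398710")); // false
    }
}
```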
Sample Code - Complete
Here is sample code showing how a fairly standard token format is defined using the regular expressions (“regex”) custom formatting rules of the createNewFormat() request:
public static void main(String [] args) throws TokenException
{
    String _dbTable = "TV_BULK";
    TokenService _ts = new TokenService("maksim", "asdf1234".toCharArray(),
            "tmowner", "tmowner".toCharArray());
    String FirstTwoLastFourSplitter = "(.{2})(.*)(.{4})";
    String input = "1234-5678-9012-345";

    // The various splicers defined below can be used with the FirstTwoLastFourSplitter.
    String [] splicer =
    {
        "(0 GLOBAL LENGTH=10 LC=PASS)(1 KEEP)(2 RANDOM)(3 KEEP)",
        "(0 GLOBAL LC=PASS)(1 KEEP)(2 RANDOM)(3 KEEP)",
        "(0 GLOBAL LENGTH=20 LC=PASS)(1 KEEP)(2 RANDOM)(3 KEEP)",
        "(0 GLOBAL LENGTH=25 LC=PASS)(1 KEEP)(2 RANDOM)(3 KEEP)",
        "(1 KEEP)(2 RANDOM)(3 KEEP)"
    };
    boolean [] lc = { true, false, true, true, false };

    for (int i = 0; i < splicer.length; i++)
    {
        System.out.println("*** TEST " + i + " ***");
        int formatId = _ts.createNewFormat("Format-" + i,
                FirstTwoLastFourSplitter, splicer[i]);
        String token = _ts.insert(input + i, _dbTable, formatId, lc[i]);
        System.out.println("INPUT VALUE: " + input + i);
        System.out.println("SPLITTER: " + FirstTwoLastFourSplitter);
        System.out.println("SPLICER: " + splicer[i]);
        System.out.println("LUHN CHECK: " + lc[i]);
        System.out.println("REGEX TOKEN: " + token);
        System.out.println("TOKEN LENGTH: " + token.length());
        System.out.println("");
    }
}
Note
If you use a Luhn Check, the input data must include at least 10 digits and cannot contain alpha characters.
Samples
FirstSixLastFourFailLuhnCheck1,(.{6})(.*)(.{4}),(0 LC=FAIL)(1 KEEP)(2 RANDOM DOMAIN=N)(3 KEEP)
FixedFirstTwoLastFourFailLuhn1,(.{2})(.*)(.{4}),(0 LC=FAIL)(1 MASK VALUE=11)(2 RANDOM DOMAIN=N)(3 KEEP)
NewFormat2,(.{6})(.*)(.{4}),(1 KEEP)(2 RANDOM DOMAIN=N)(3 MASK VALUE=8888)
Note
You are responsible for applying this and other formatting functionality in a way that lets you test the results against your applicable security criteria. There are many ways to apply CT-V methods to produce tokens, but CT-V cannot guarantee that the results will meet your organization's requirements.
Create Custom Token Formats
Another method used to create custom token formats, the predecessor to the Regex version of createNewFormat(), is:
public int createNewFormat (int leadPositions, int trailPositions, String leadMask, int luhnCheck) throws TokenException
Or, if you want a token to be a different size than the input data:
public int createNewFormat (int leadPositions, int trailPositions,
String leadMask, int luhnCheck, int tokenLength) throws TokenException
Parameters:
leadPositions: number of positions to preserve (if any) on the left side of the value.
trailPositions: number of positions to preserve (if any) on the right side of the value.
leadMask: a fixed string of digits that masks the leading positions.
Note
The sum of leadPositions and trailPositions (or the length of leadMask plus trailPositions) cannot exceed the token length, which is either the value set in the tokenLength parameter or the width of the Token Vault's token column. If this rule is violated, an exception is thrown.
luhnCheck: indicates whether the token must fail the Luhn check (-1), pass it (1), or whether the check does not matter (0).
tokenLength: This parameter is optional. It indicates the length of the token if it differs from the input data. tokenLength cannot be larger than the width of the Token Vault's token column.
For example, assuming ts is an instance of TokenService, you can create a new format that masks the first five characters with 8s, preserves the last three characters, and fails the Luhn check:
int yourNewFormat = ts.createNewFormat(0, 3, "88888", -1);
The following format keeps the first four and last four characters, passes the Luhn check, and always returns a 10-digit token:
int anotherNewFormat = ts.createNewFormat(4, 4, null, 1, 10);
The method returns a format ID in the form of an integer. Your call to insert() must include this value. For example:
token = ts.insert(dataToEncrypt, TokenVault, yourNewFormat, false);
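To picture what a leadMask/trailPositions format produces, here is a hypothetical stand-in (not the CT-V implementation): the token starts with the lead mask, ends with the preserved trailing characters, and the positions in between are random digits. The Luhn constraint is omitted for brevity.

```java
import java.util.Random;

public class LegacyFormatSketch {
    // Hypothetical sketch of the leadMask/trailPositions token shape:
    // lead mask, then random digits, then the preserved trailing characters.
    static String shapeToken(String input, String leadMask, int trailPositions, Random rng) {
        StringBuilder token = new StringBuilder(leadMask);
        int randomCount = input.length() - leadMask.length() - trailPositions;
        for (int i = 0; i < randomCount; i++) {
            token.append((char) ('0' + rng.nextInt(10)));
        }
        token.append(input.substring(input.length() - trailPositions));
        return token.toString();
    }

    public static void main(String[] args) {
        // "88888" masks the first five positions; the last three are kept.
        String token = shapeToken("4024007151852910", "88888", 3, new Random());
        System.out.println(token); // 88888, then 8 random digits, then 910
    }
}
```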
The SOAP Webservice API call is:
public Integer CreateNewTokenFormat (String naeUser, String naePassword, String dbUser, String dbPswd, Integer leadPositions, Integer trailPositions, String leadMask, Integer luhnCheck)
To create the new format shown above (masking the first five characters with 8s, preserving the last three characters, and failing the Luhn check), write the following, assuming SafeNetTokenizerStub is a stub class generated from the WSDL. Note that this example uses the CreateNewTokenFormatWithTokenLength variant, which adds a tokenLength parameter.
SafeNetTokenizerStub.CreateNewTokenFormatWithTokenLength custFormat = new SafeNetTokenizerStub.CreateNewTokenFormatWithTokenLength();
custFormat.setNaeUser("KeyManagerUser");
custFormat.setNaePassword("KeyManagerUserPassword");
custFormat.setDbUser("DatabaseUser");
custFormat.setDbPswd("DatabaseUserPassword");
custFormat.setLeadPositions (0);
custFormat.setTrailPositions (3);
custFormat.setLeadMask ("88888");
custFormat.setLuhnCheck (-1);
custFormat.setTokenLength (0);
SafeNetTokenizerStub.CreateNewTokenFormatWithTokenLengthResponse createResponse = stub.createNewTokenFormatWithTokenLength(custFormat);
int yourNewFormat = createResponse.get_return();
The method returns a format ID in the form of an integer. You’ll use this value when inserting a new token. For example:
SafeNetTokenizerStub.InsertTokenWithCustomData insertWithCustData = new SafeNetTokenizerStub.InsertTokenWithCustomData();
insertWithCustData.setNaeUser("KeyManagerUser");
insertWithCustData.setNaePassword("KeyManagerUserPassword");
insertWithCustData.setDbUser("DatabaseUser");
insertWithCustData.setDbPswd("DatabaseUserPassword");
insertWithCustData.setTableName("TOKENVAULTNAME");
insertWithCustData.setFormat( yourNewFormat );
insertWithCustData.setValue( "016443379888776560" );
insertWithCustData.setCustomData("999");
insertWithCustData.setLuhnCheck( false );
// Invoke the operation and read back the token; the response class and
// get_return() accessor follow the WSDL naming pattern shown above.
SafeNetTokenizerStub.InsertTokenWithCustomDataResponse insertResponse = stub.insertTokenWithCustomData(insertWithCustData);
String token = insertResponse.get_return();
When creating new token formats, be aware of the following rules:
You can’t have both a lead position and a lead mask. When a lead mask is set, any lead position setting is ignored.
The lead positions plus the trail positions (or the lead mask plus the trail positions) can't be longer than the token value. Even when the limit is not exceeded, consider how much of the value is actually tokenized: given a 16-digit input, preserving the first 6 and the last 8 digits tokenizes only 2 digits. This is an extreme case, but it illustrates the point.
If you use a Luhn Check (1 or -1), the input data must also include at least 10 digits and cannot contain alpha characters.
The CT-V throws an exception if the rules are ignored.
New formats are stored in the SFNT_TOKEN_FORMAT table in the database; there is one such table per database. In Oracle, the maximum number of token formats is 899; in SQL Server, you should likewise not exceed 899 formats. In either database, that is far more than most deployments need; in practice you will probably create no more than ten formats, if any.
FORMAT | FORMATDESCR | LEADPOSITIONS | TRAILPOSITIONS | LEADMASK | LUHNCHECK | TOKENLENGTH
---|---|---|---|---|---|---
100 | | 0 | 3 | 88888 | -1 | 0
101 | | 4 | 4 | | 1 | 10
Note
The new custom token formats generated (100 and 101) are stored under the FORMAT column in the preceding table. Pass these values in the format parameter of the APIs when using the custom token formats.
Create a Token Generation Framework
For even more customization, you can define a token generation and validation mechanism. This mechanism is provided through a pluggable framework that has six features.
Customize Tokenization: create customized tokenization functions
Custom Tokenizer Function Interface: pass an instance of an ITokenizer implementation without creating the custom token format
Custom Token Validator: create custom token validation functions
Custom Token Validator Function Interface: pass an instance of a custom token validator implementation without creating the custom token validator function
Custom Token Generator with a Mask: pass your own custom token generator to create a token not stored in a vault
Custom Token Validator Function with Mask Using an Interface: pass an instance of a Token Validator Function with a Mask without creating the function
Customize Tokenization
Custom Tokenization lets you create custom tokenization functions that define how data is tokenized. This is useful, for example, when you must generate a token in the format defined by the Italian fiscal code (Codice Fiscale).
There is an interface in the CT-V jar file.
public interface com.safenet.token.ITokenGenerator
{
    public String generateToken( String value ) throws TokenException;
}
Define a class that implements the interface.
package com.customer.code.tokenizers;

class MyTokenizer implements com.safenet.token.ITokenGenerator
{
    @Override
    public String generateToken( String value ) throws TokenException
    {
        // For simplicity, delegates to the customer's tokenizer function.
        return tokenizeFunction( value );
    }
}
Install the code.
Put the compiled code in the classpath or the same directory as the CT-V installation. If the code is in the CT-V installation directory, make sure the directory structure of the package is present. The preceding example uses the package com.customer.code.tokenizers, so the class file, MyTokenizer.class, would be in com\customer\code\tokenizers if CT-V is installed in the current directory.
Assign the custom token format.
TokenService ts = new TokenService(…);
int format = ts.createNewFormat( "com.customer.code.tokenizers.MyTokenizer", null, 0 );
TmResult result = ts.insert( data, customData, dbTable, format, saveExceptions );
Caution
When assigning an implementation of com.safenet.token.ITokenGenerator to a format ID by calling createNewFormat, the last two parameters are not used. If they contain any values other than null and 0 (or "" and 0), createNewFormat throws an exception.
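Because com.safenet.token.ITokenGenerator ships inside the CT-V jar, the following standalone sketch declares a local mirror of the interface (same method signature) purely to illustrate what an implementation looks like. In real code you implement the CT-V interface itself, and the toy format rule here is purely illustrative.

```java
public class TokenGeneratorSketch {
    // Local mirror of com.safenet.token.ITokenGenerator, declared here only
    // so the sketch compiles without the CT-V jar on the classpath.
    interface ITokenGenerator {
        String generateToken(String value) throws Exception;
    }

    // Toy tokenizer: keep the length and the separators, replace every
    // digit with '9'. A real implementation applies your format rules.
    static class MyTokenizer implements ITokenGenerator {
        @Override
        public String generateToken(String value) {
            return value.replaceAll("[0-9]", "9");
        }
    }

    public static void main(String[] args) throws Exception {
        ITokenGenerator t = new MyTokenizer();
        System.out.println(t.generateToken("1234-5678")); // 9999-9999
    }
}
```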
Custom Tokenizer Function Interface
You can pass an instance of an ITokenizer implementation without creating the custom token format.
Note
Passing an instance directly is not supported by the web service interface. When using the web service, use the procedure described above.
There is an interface in the CT-V jar file.
public interface com.safenet.token.ITokenizer
{
    public String generateToken( String value ) throws TokenException;
}
Define a class that implements the interface.
class MyTokenizer implements com.safenet.token.ITokenizer
{
    @Override
    public String generateToken( String value ) throws TokenException
    {
        // For simplicity, delegates to the customer's tokenizer function.
        return tokenizeFunction( value );
    }
}
Pass an instance of the implementation.
TokenService ts = new TokenService(…);
com.safenet.token.ITokenizer myTokenizer = new MyTokenizer();
TmResult result = ts.insert( data, customData, dbTable, myTokenizer, saveExceptions );
Custom Token Validator
You can create custom token validation functions; CT-V still generates the token itself, as shown below.
There is an interface in the CT-V jar file.
public interface com.safenet.token.ITokenValidator
{
    public Boolean isValid( String value ) throws TokenException;
}
Define a class to validate the generated token.
package com.customer.code.validators;

class MyValidator implements com.safenet.token.ITokenValidator
{
    @Override
    public Boolean isValid( String value ) throws TokenException
    {
        return isTokenToMyLiking( value );
    }
}
Install the code.
Put the compiled code in the classpath or the same directory as the CT-V installation. If the code is in the CT-V installation directory, make sure the directory structure of the package is present. The preceding example uses the package com.customer.code.validators, so the class file, MyValidator.class, would be in com\customer\code\validators if CT-V is installed in the current directory.
Assign the custom token format.
TokenService ts = new TokenService(…);
int format = ts.createNewFormat( null, "com.customer.code.validators.MyValidator", TokenService.RANDOM_TOKEN );
TmResult result = ts.insert( data, customData, dbTable, format, false, saveExceptions );
Note
The last parameter value TokenService.RANDOM_TOKEN in createNewFormat is the token format for CT-V to use when generating tokens.
When assigning an implementation to a format ID by calling createNewFormat, the first parameter is not used. If it contains any value other than null or "", createNewFormat throws an exception.
To prevent CT-V from hanging, ITokenValidator.isValid(...) may return false at most 1000 times; after that, the insert or mask operation fails with an exception.
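As with the generator interface, the following standalone sketch mirrors com.safenet.token.ITokenValidator locally so it compiles without the CT-V jar; the rejection rule is purely illustrative.

```java
public class TokenValidatorSketch {
    // Local mirror of com.safenet.token.ITokenValidator, declared here only
    // so the sketch compiles without the CT-V jar on the classpath.
    interface ITokenValidator {
        Boolean isValid(String value) throws Exception;
    }

    // Illustrative rule: reject any token containing a run of four zeros.
    // A real validator applies your organization's acceptance criteria.
    static class MyValidator implements ITokenValidator {
        @Override
        public Boolean isValid(String value) {
            return !value.contains("0000");
        }
    }

    public static void main(String[] args) throws Exception {
        ITokenValidator v = new MyValidator();
        System.out.println(v.isValid("12345678")); // true
        System.out.println(v.isValid("12000056")); // false
    }
}
```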
Custom Token Validator Function Interface
You can pass an instance of a custom token validator implementation without creating the custom token validator function.
There is an interface in the CT-V jar file.
public interface com.safenet.token.ITokenValidator
{
    public Boolean isValid( String value ) throws TokenException;
}

Define a class that implements the interface.

class MyValidator implements com.safenet.token.ITokenValidator
{
    @Override
    public Boolean isValid( String value ) throws TokenException
    {
        // For simplicity, delegates to the customer's validator function.
        return isTokenToMyLiking( value );
    }
}
Pass an instance of the implementation.
com.safenet.token.ITokenValidator myValidator = new MyValidator();

// Tokenizing
TmResult result = ts.insert( data, customData, dbTable, TokenService.RANDOM_TOKEN, false, myValidator, saveExceptions );
Note
In place of TokenService.RANDOM_TOKEN you can pass any format that is supplied with CT-V, or any custom format created with createNewFormat that is not a validator.
Custom Token Generator with a Mask
You can call TokenService.mask to pass your own custom token generator. The token generated by this call is not stored in a token vault.
There is an interface in the CT-V jar file. This is the same interface that is used for Customize Tokenization.
public interface com.safenet.token.ITokenGenerator
{
    public String generateToken( String value ) throws TokenException;
}
Define a class that implements the interface.
package com.customer.code.tokenizers;

class MyTokenizer implements com.safenet.token.ITokenGenerator
{
    @Override
    public String generateToken( String value ) throws TokenException
    {
        return tokenizeFunction( value );
    }
}
Install the code.
Put the compiled code in the classpath or the same directory as the CT-V installation. If the code is in the CT-V installation directory, make sure the directory structure of the package is present. The preceding example uses the package com.customer.code.tokenizers, so the class file, MyTokenizer.class, would be in com\customer\code\tokenizers if CT-V is installed in the current directory.
Pass the custom token generator.
TokenService ts = new TokenService(…);
int format = ts.createNewFormat( "com.customer.code.tokenizers.MyTokenizer", null, 0 );
String[] tokens = ts.mask( data, format, null, false );
If desired, you can also define a detokenizer class.
package com.customer.code.tokenizers;

class MyDetokenizer implements com.safenet.token.ITokenGenerator
{
    @Override
    public String generateToken( String value ) throws TokenException
    {
        // For simplicity, delegates to the customer's detokenizer function.
        return detokenizeFunction( value );
    }
}
Install the code.
Put the compiled code in the classpath or the same directory as the CT-V installation. If the code is in the CT-V installation directory, make sure the directory structure of the package is present. The preceding example uses the package com.customer.code.tokenizers, so the class file, MyDetokenizer.class, would be in com\customer\code\tokenizers if CT-V is installed in the current directory.
Pass the custom detokenizer.
int format = ts.createNewFormat( "com.customer.code.tokenizers.MyDetokenizer", null, 0 );
String[] data = ts.mask( tokens, format, null, false );
Custom Token Validator Function with Mask Using an Interface
You can pass an instance of a Token Validator Function with a Mask without creating the function.
There is an interface in the CT-V jar file. This is the same interface that is used for Customize Tokenization.
public interface com.safenet.token.ITokenGenerator
{
    public String generateToken( String value ) throws TokenException;
}
Define a class that implements the interface.
package com.customer.code.tokenizers;

class MyTokenizer implements com.safenet.token.ITokenGenerator
{
    @Override
    public String generateToken( String value ) throws TokenException
    {
        // For simplicity, delegates to the customer's tokenizer function.
        return tokenizeFunction( value );
    }
}
Install the code.
Put the compiled code in the classpath or the same directory as the Tokenization installation. If the code is in the Tokenization installation directory, make sure the directory structure of the package is present. The following example uses the package com.customer.code.tokenizers so the class file, MyTokenizer.class, would be in com\customer\code\tokenizers if CT-V is installed in the current directory.
Pass an instance of the function.
TokenService ts = new TokenService(…);
com.safenet.token.ITokenGenerator myTokenizer = new MyTokenizer();
String[] tokens = ts.mask( data, myTokenizer );