Another striking feature is that the fields in the file can be arranged in any order, provided the header row uses the AX field names; the field names in the header give you the freedom to lay out the columns however you like. There is also a facility to extract a template together with data, which appends a "$" to all mandatory field names, and the extracted file can be imported as-is.
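For example, assuming a customer entity, an extracted template might look like the sample below, with "$" marking the mandatory fields (the field names are illustrative CustTable fields, not taken from the tool itself):

```csv
AccountNum$,Name$,PaymTermId,CurrencyCode
CUST-001,Contoso Ltd,N030,USD
CUST-002,Fabrikam Inc,N060,EUR
```

Because the header carries the AX field names, the same file would still import correctly if the columns were reordered.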
When using a bulk upload, it is sometimes difficult to analyze the Infolog, since its limit is 10,000 lines. Here you have an error-file and log-file setting that stores the rejected data: the error file holds those records from the actual file that were not inserted, updated, or validated.
The options available are:
Read - import mode
Write - export mode (to file)
File path options
Import mode - Validate only without import (no import takes place), Insert new only, or Insert new and update existing.
Entity - The CSV import tool has many predefined scripts to take care of customers, vendors, addresses, employees, the item master, and the chart of accounts (COA). You just need to put the data in the required format; for example, in the item master the data has to be right skewed and the last three fields must be prices.
The address import has a common field setting for customers, vendors, and employees.
The CSV import options form
Validation Type
1) Validate Relation on Table
If you want to validate the data based on the table relations, then Validate Relation on Table is used. This mode is not preferred while updating; the reference fields also get validated, so you need to ignore those errors.
2) Validate Relation on EDT
If you want to validate the data based on the parent table having a field relation on an enum or EDT, then Validate Relation on EDT is used.
On selecting the Generic entity, you get the option to select the table.
Let us consider that we have three customer records to be migrated in a CSV file, and that the payment term has not come across correctly in the file (i.e., the value does not exist in the PaymTerm table).
Three customer records; red indicates a validation error
The error log and the error records can also be found at the path specified in the error-log parameters. You can correct the data that was not imported in the error file and re-import it. The error file contains all the records that were not inserted due to some error, which is very useful because the wrong data is filtered out effectively. The log file copies the Infolog along with the reference line number of each record.
Code used to build the dynamic query for validating an EDT relation:
// Walk every line of the relation and add a matching query range
for (relationLine = 1; relationLine <= linesCnt; relationLine++)
{
    fieldId = dictRelation.lineExternTableValue(relationLine);
    switch (dictRelation.lineType(relationLine))
    {
        case TableRelation::Dataset:
        case TableRelation::Field:
            // Normal field relation: range on the external field,
            // filtered by the value being validated
            _queryBuildRange1 = _queryBuildDataSource.addRange(fieldId);
            _queryBuildRange1.value(_Value);
            break;

        case TableRelation::ExternFixed:
            // Fixed value on the related table (e.g. an enum value)
            _queryBuildRange2 = _queryBuildDataSource.addRange(dictRelation.lineExternTableValue(relationLine));
            if (_array > 0)
                _queryBuildRange2.value(int2str(_array));
            else
                _queryBuildRange2.value(int2str(dictRelation.lineTableValue(relationLine)));
            break;
    }
}
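Once the ranges are built, the query still has to be executed to decide whether the value is valid. A minimal sketch of that final step, assuming the data source was added on the relation's external table (the variable names and the error message are illustrative, not taken from the tool):

```
// Sketch only: _queryBuildDataSource is assumed to have been created as
//     _queryBuildDataSource = query.addDataSource(dictRelation.externTable());
// before the range-building loop above.
queryRun = new QueryRun(query);

if (!queryRun.next())
{
    // No parent record matched the ranges, so the imported value is invalid
    checkFailed(strfmt("Value %1 not found in table %2",
                       _Value,
                       tableid2name(dictRelation.externTable())));
}
```

Records that fail this check are the ones the tool routes to the error file.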
Code used to validate a table relation:
switch (dictRelation.lineType(relationLine))
{
    case TableRelation::Field:
        // Field-to-field relation: range on the external field,
        // filtered by this record's own value
        thisFieldId  = dictRelation.lineTableValue(relationLine);
        printFieldId = thisFieldId;
        if (_commonRecord.(thisFieldId) && ValidationRequired && thisFixedValidation)
        {
            _queryBuildDataSource.addRange(ExtfieldId).value(_commonRecord.(thisFieldId));
        }
        else
        {
            ValidationRequired = false;
        }
        break;

    case TableRelation::ThisFixed:
        // Fixed value on this table: the relation only applies
        // when the record actually holds the fixed value
        thisFieldId = dictRelation.lineTableValue(relationLine);
        if (_commonRecord.(thisFieldId) && (_commonRecord.(thisFieldId) != ExtfieldId && ValidationRequired))
        {
            thisFixedValidation = false;
        }
        break;

    case TableRelation::ExternFixed:
        // Fixed value on the related table (e.g. an enum value)
        if (ValidationRequired)
        {
            _queryBuildDataSource.addRange(dictRelation.lineExternTableValue(relationLine)).value(int2str(dictRelation.lineTableValue(relationLine)));
        }
        break;
}