

We have a daily MultiLoad insert job, and a few critical jobs depend on its successful completion. The MultiLoad often fails due to data issues, e.g. an unexpected input character that looks the same as the delimiter. Currently we have difficulty filtering the source data, so I wonder if, in the MultiLoad script, there is a way to "bypass" the offending records and complete the MultiLoad with the lock released and a return code of 0 (or something similar).

I understand an INMOD can filter data before MultiLoad, but I have no knowledge of C.

We are running TD 8.2; this MultiLoad job runs from MVS.
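For what it's worth, one common approach is the ERRLIMIT option of .BEGIN MLOAD: records that fail data conversion are diverted to the acquisition error table and the load carries on, as long as the rejected-record count stays under the limit. A minimal sketch, with hypothetical database, table, layout, and file names:

```
.LOGTABLE mydb.mload_log;            /* hypothetical names throughout */
.LOGON tdpid/user,password;
.BEGIN IMPORT MLOAD
    TABLES mydb.target
    ERRLIMIT 1000;                   /* tolerate up to 1000 bad records */
.LAYOUT inlayout;
.FIELD col_a * VARCHAR(20);
.FIELD col_b * VARCHAR(10);
.DML LABEL ins;
INSERT INTO mydb.target (col_a, col_b) VALUES (:col_a, :col_b);
.IMPORT INFILE indata LAYOUT inlayout APPLY ins;
.END MLOAD;
.LOGOFF;
```

Rejected rows land in the acquisition error table for later inspection; whether the final return code is 0 or a warning value when some rows were rejected is worth verifying against your MultiLoad release on MVS.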


Hi, I'm new to Teradata and trying to write a simple program using CLIv2 (Teradata version 12). Using the sample CLI code provided by Teradata I was able to connect and execute statements like CREATE TABLE and INSERT INTO. But when I want to fetch the result set of a query such as SELECT * FROM SAMPLES.table1, I'm facing a problem in coding: I don't know how to get the records from the query. Could anyone help me with this?

As I understand it, after executing the query using DBCHCL(&result,cnta,&dbcarea), the result will be stored in dbcarea.fet_data_ptr. I don't know which structure I have to cast this to, and also whether I need to take care of the data types returned by the result set. In the Teradata sample program, it reads character by character and stores the data in a file. I want to know how to read a database column. A sample example for fetching the records would be very helpful to me.

Regards,
Shrihari

Opening SQL Scrapbook triggered the following error:

Eclipse: 20090621-0832
Data Tools: 1.6.2
TD Plugin :

java.lang.NoClassDefFoundError: org/eclipse/datatools/sqltools/routineeditor/ProcEditorInput
at com.teradata.datatools.dtp.sqltools.db.teradata.TeradataExecuteMultiSQLActionExtension.contributeToContextMenu(TeradataExecuteMultiSQLActionExtension.java:53)
at org.eclipse.datatools.sqltools.sqleditor.SQLEditor.fillContextMenu(SQLEditor.java:1258)


I need some help regarding an error that occurred while our production loading was running. Here is the error message:
Access module error '34' received during 'read' operation on record
number '1361905': 'pmunxRBuf: fread error (Invalid argument)'

At a quick look at the record, everything seemed fine. So I removed the record from the input file (saving it for analysis) and reran the MLoad. It ran cleanly.

Hi, I'm new to Teradata. I have some code that was written using Preprocessor2 a long time ago (maybe on an older version of Teradata, 8 years back). Now I want to reuse the code and compile it on Windows. In "Teradata Preprocessor2 for Embedded SQL Programmer Guide, Release 09.01.00, B035-2446-115A, November 2005" there is a section on how to use pp2win.exe, but with Teradata Express Edition installed I don't find pp2win.exe in the "Teradata client\bin" folder.

Could anyone please let me know how to compile the *.pp2c file, and what options I need to set under Visual Studio? I checked the sample code for CLIv2, and using it I'm able to connect to the Teradata database. I don't want to rewrite the Preprocessor2 code in CLIv2, because that would take a long time, and I have to deliver a PoC in 2 weeks, so this is high priority for me.

Thanks in advance for your help.

I need to COLLECT STATS after an UPDATE statement:

BT; UPDATE table; COLLECT STATS; DELETE; ET;

I have to use BT;/ET; as a multi-statement request in my BTEQ scripts, but I'm getting an error: a DDL statement (COLLECT STATS) is not allowed in a multi-statement request. Also, a DDL statement (COLLECT STATS) has to be the last statement of the request. How should I use COLLECT STATS inside a multi-statement request using BT/ET?

The process below also throws the error:

BT; UPDATE ...; ET;
BT; COLLECT STATS ...; ET;
BT; DELETE ...; ET;

Help please!
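A sketch of the usual workaround, with hypothetical table and column names: leave the COLLECT STATS outside any BT/ET pair as its own single-statement request, since a DDL statement must be the only (or last) statement of its request; in Teradata-mode BTEQ it then runs as its own implicit transaction between the two explicit ones:

```
BT;
UPDATE mydb.mytable SET col_a = col_a + 1 WHERE col_b = 'X';
ET;

COLLECT STATISTICS ON mydb.mytable COLUMN (col_a);

BT;
DELETE FROM mydb.mytable WHERE col_b = 'Y';
ET;
```

The trade-off is that the three steps are no longer one atomic unit of work; if that atomicity matters, the stats collection has to move after the whole transaction instead.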

We tried to use the Oracle Transparent Gateway and successfully ran a query from Oracle to Teradata. The purpose is to run joint queries across Teradata and Oracle tables/views. Because of performance concerns, we haven't yet tested the join query with large tables. My questions are: 1.

Hi, can anyone help me with loading Teradata tables from DB2 tables without using Informatica? Is there any way using the loaders or BTEQ? Thanks, Rakesh

Hi, can anyone send me the procedure for loading data from flat files (VSAM, txt, csv, etc.) into Teradata tables without using Informatica? Thanks, Rakesh

I'm having problems running FastLoad from a Linux client when a variable-length field is present.

I want to copy a selection of data from the production server to the test one. I've tried running the export using both FASTLOAD and TEXT formats, and then a FastLoad using both FORMATTED and TEXT formats, but it objects to the variable-length data every time.

The error message given is -

**** 16:22:07 Bad file or data definition.
**** 16:22:07 The length of: COLUMN_A in row: 1 was greater than
Defined: 250, Received: 12336

We have many tables with secondary indexes, and while loading we get an error. We have dropped the secondary indexes and loaded the data. Is it possible to include the statement below in the MLOAD script?

CREATE INDEX gmrdate_usi_mo_nm (mo_mm) ON teradbc.gmrdate;

Please give me a solution as soon as possible.
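As a hedged sketch using the object names from the post: DDL such as CREATE INDEX cannot run inside the load task itself, but MultiLoad's support environment accepts SQL statements outside the .BEGIN MLOAD/.END MLOAD block, so the index can be dropped before and recreated after in the same script (worth confirming on your release; note also that MultiLoad tolerates NUSIs on target tables but not USIs):

```
.LOGTABLE teradbc.mload_log;            /* log table name is assumed */
.LOGON tdpid/user,password;
DROP INDEX gmrdate_usi_mo_nm ON teradbc.gmrdate;
.BEGIN IMPORT MLOAD TABLES teradbc.gmrdate;
    /* ... .LAYOUT, .DML and .IMPORT commands as in your existing job ... */
.END MLOAD;
CREATE INDEX gmrdate_usi_mo_nm (mo_mm) ON teradbc.gmrdate;
.LOGOFF;
```

If the CREATE INDEX fails, the rest of the script still needs a recovery path (for example, a follow-up BTEQ step), so some sites prefer to keep the DDL in separate pre- and post-load jobs.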

Hi, where can I find information regarding archiving and purging in Teradata? Thanks, Sreehari

Hi, where can I find info (PDF) on the Data Mover tool of Teradata? Can it be invoked from the Teradata Demo Version just like the other utilities (MultiLoad etc.)? Thanks, Sreehari

Where can I find details about the Data Mover tool in Teradata?


I'm running into this error while writing to a file in TPT:

STR_EMI_DET_SQL: sending SELECT request
STR_EMI_DET_SQL: retrieving data
PXTBS_PutRow: Invalid input row length, status = Length Error
STR_EMI_DET_SQL: Error 29 in putting a record onto the data stream
Operator(libselectop.so) instance(1): EXECUTE method failed with status = Length Error
STR_EMI_DET_SQL: disconnecting sessions
STR_EMI_DET_FILE_APPEND: Total files processed: 0.
Job step S2 terminated (status 8)
Job dcole terminated (status 8)

I have installed Teradata Express. For some reason that I can't figure out, I can't log into Teradata Manager: it tells me "Invalid Logon String". I'm using localtdcop1 (I also tried localtd and localhost) as the system name, with dbc as user and password. I can use everything else except BTEQ. I've started Teradata and installed the Eclipse plugin.

The other issue I am running into is that when I try to create a UDF (as dbc), it tells me that I don't have permission. I'm pretty new to Teradata, but I figure I need to grant permissions through Teradata Manager. I tried customer support, but they don't seem to return messages.

Hi, does anyone have any code examples of how to do positioned UPDATE and DELETE in CLIv2, please?

Hi, we are trying to migrate our TD database to an Oracle DB. I read on many sites that Teradata Parallel Transporter is a very good utility for transporting data across different databases. If you have any idea how to use this tool/utility, kindly explain; I am struggling to get started with it. Your help is highly appreciated. Thanks, Shinesh

Hi all, we are aware that TPump gives control over the rate per minute at which statements are sent to the RDBMS. This can be done either dynamically or with a predefined statement in the script. Can any of you let me know how this can be done dynamically? Can you give me a sample script of how it has to be defined?

Hello, can MLoad handle data that is in scientific notation, for example 8.56055E10? If so, how would you code for such a value? Thanks! Paul
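One common pattern, sketched below with made-up layout, table, and column names, is to read the value as character data and let the database do the conversion, since Teradata will cast a string in E-notation to FLOAT. This assumes your MultiLoad release accepts an expression on the input field in the DML:

```
.LAYOUT inlayout;
.FIELD sci_val * VARCHAR(20);          /* '8.56055E10' arrives as text */
.DML LABEL ins;
INSERT INTO mydb.target (num_col)
VALUES (CAST(:sci_val AS FLOAT));      /* the DB parses the E-notation */
```

If the target column is DECIMAL rather than FLOAT, a second cast would be needed, with the usual caveats about precision loss.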

Has anyone observed this SQL Assistant issue? It has been reported sporadically by different users at our workplace, and I have also personally seen it in the past. When trying to save a file in SQL Assistant, you receive an error message: "The process cannot access the file because it is being used by another process" (screen shot attached). Multiple versions of SQL Assistant could be open; however, they are not necessarily using the same .SQL file. Also, there is no other user or process accessing the file that cannot be saved. I even asked the reporting user to start saving to her C drive to ensure no other process or user could be accessing the file. She still received the error message. The user now has only one copy of SQL Assistant open, and this error is still occurring. We cannot determine what could be locking the file.

Hi all, FastLoad does not allow duplicate records, but how can we identify the duplicate records? They are not loaded into the error tables; we can only see them on the output screen. Please let me know any ideas. Regards, Kiran

Hi all, can we use conditional statements (like .IF THEN in BTEQ) in a FastLoad script? Also, is there any way to exit BTEQ if we get an error in the SQL statements used in BTEQ?

Hi all, I would like to know how FastLoad identifies duplicates while loading. Regards, Rock

I exported some byte data from DB2 into a flat file (such as x'00' x'20' x'08' ...). I tried fixed-length and delimited formats, but I was not able to load it into a Teradata BYTE column using MLOAD. The byte data, when exported to these files, is a sequence of 13 characters. I used a delimiter that would not conflict with any of the byte-data characters.

For the fixed-length TEXT file, I defined the field as BYTE(13) in the layout, the same size as the target table. When run, I get an error saying the data in the file is invalid for the byte column. For the VARTEXT delimited file, I defined the field as VARBYTE(13) in the layout, and I get an error such as "data is too short". It seems like MLOAD can't break up the record correctly. Is the problem related to the non-standard text characters in the file? Should I use the UNFORMATTED format? Thanks

Hi guys! I need to use a newer .NET Data Provider for Teradata. I've installed these items:

* Teradata .NET Data Provider Installer
* README
* Teradata .NET Data Provider Help
* Release Definition and Install Instructions

These items are contained in this link: http://developer.teradata.com/download/connectivity/teradata-net-data-provider-13-0-0-1#

Is this enough, or do I have to install another item? When I have to use Teradata, I have to initialize Teradata Service Control, and this is still at version 12.0. Could you help me run the new version? Thanks for the help, Anderson

Hi guys, I really need your help. Here is the FastExport script I am using to export 4 columns of data in FASTLOAD mode:

.LOGTABLE RETAIL.FEXP_LOG;
.LOGON ***/******;
.BEGIN EXPORT;
.EXPORT OUTFILE C:\emp1.txt MODE RECORD FORMAT fastload;
SELECT trim(EmpNo (VARCHAR(15))),
       trim(Name (VARCHAR(18))),
       trim(DeptNo (SMALLINT)),
       trim(Salary (DECIMAL(8,2)))
FROM RETAIL.EMP;
.END EXPORT;
.LOGOFF;

I am then FastLoading this same exported data into a table. Here is my FastLoad script:

.LOGON ***/******;
DROP TABLE RETAIL.EMP2;
DROP TABLE RETAIL.ET_EMP2;
DROP TABLE RETAIL.UV_EMP2;
CT RETAIL.EMP2
  ( EMPNO VARCHAR(15),
    NAME VARCHAR(18),
    DEPTNO SMALLINT,
    SALARY DECIMAL(8,2) )
  PRIMARY INDEX ( EMPNO );
DEFINE EmpNo (VARCHAR(15)),
       Name (VARCHAR(18)),
       DeptNo (SMALLINT),
       Salary (DECIMAL(8,2))
FILE = c:\emp1.txt;
BEGIN LOADING retail.emp2 ERRORFILES RETAIL.ET_EMP2, RETAIL.UV_EMP2 CHECKPOINT 100;
INSERT INTO retail.emp2 VALUES (:EmpNo, :Name, :DeptNo, :Salary);
END LOADING;
LOGOFF;

While executing the FastLoad script I get the following error:

Record is too long by 10 byte(s).
Possible cause: a variable-size field may not contain a 2-byte length, or the 2-byte length may be invalid.

But how can an error occur when both tables have the same DDL, with everything the same? Please suggest something. Thanks and regards, Kapil
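One likely explanation, offered as a guess: TRIM() returns a character result, so trim(DeptNo (SMALLINT)) and trim(Salary (DECIMAL(8,2))) export VARCHAR data, while the FastLoad DEFINE declares a 2-byte binary SMALLINT and a packed DECIMAL. The exported records are therefore wider than the layout expects, even though the two table DDLs match. A sketch of an export SELECT that keeps the numeric columns binary so the record matches the DEFINE:

```
.BEGIN EXPORT;
.EXPORT OUTFILE C:\emp1.txt MODE RECORD FORMAT fastload;
SELECT TRIM(EmpNo (VARCHAR(15))),
       TRIM(Name  (VARCHAR(18))),
       DeptNo,                      /* leave as a 2-byte SMALLINT */
       Salary                       /* leave as DECIMAL(8,2) */
FROM RETAIL.EMP;
.END EXPORT;
```

The alternative direction also works: keep the trims in the export and declare all four fields as VARCHAR in the FastLoad DEFINE, letting the database convert on insert.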

Hi, can someone explain this code to me?

.SET DBASE_TARGETTABLE TO 'fexp_usr';    /* This I understand */
.SET TARGETTABLE TO i999839999;          /* This I understand */
.BEGIN IMPORT MLOAD                      /* This I understand */
/* This I DON'T understand: */
TABLES &DBASE_TARGETTABLE..&TARGETTABLE

Does it mean something like "from first ... to last" in Oracle, or is it the table name? Regards, Raja

Hi, I was trying to start Teradata using Teradata Service Control, but it says "Bynet installed, but not running. Reboot to start." I tried rebooting my system, and even after that I get the same error message. Any idea how to get rid of this, or are there any other ways to start Teradata? Thanks in advance.

I'm not going to be using Teradata much, but I may need to browse a particular Teradata database. What tool do I need to install (MS Windows XP) so I can do that? I've heard of "SQL Assistant", but I just can't find it anywhere.

Are there any tools in Teradata that do the same work as SQL Server Profiler?

Hi, how many ways can we migrate objects from one database to another? Thanks, Yoga

1) Why is there no journaling or fallback in FastLoad?
2) In FastLoad, I cannot have a USI or NUSI, right? I know I can drop them and recreate them after loading, but can anyone explain clearly why I cannot have an SI in FastLoad? Is it because I cannot have duplicate rows?
3) How can I get the value of the FLOADLIB environment variable?
4) Can I have more than 7 parameters in the cfg file in FastLoad?
5) Provide the syntax for converting ANSI/SQL DateTime data types to fixed-length CHAR before loading.
6) A child table referencing a master table is to be FastLoaded. If I execute the FastLoad to load this child table, will the job run and reject the unmatched records?
7) Why can't we have PACK for FastLoad too?

Please see this topic in the General forum for an introduction to the new discussion forums, and a chance to give feedback.

Does anyone know how to set the account string in TPT for the DDL and LOAD operators? Normally this is the 3rd parameter of the .LOGON statement in BTEQ. Is this the same as the AccountID in those operators?
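For what it's worth, the LOAD and DDL operators do expose a VARCHAR AccountID attribute (it shows up in generated TPT scripts), and it appears to play the same role as the account string in BTEQ's .LOGON. A sketch with hypothetical values:

```
DEFINE OPERATOR LOAD_OP
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
    VARCHAR UserName     = 'myuser',       /* hypothetical values */
    VARCHAR UserPassword = 'mypassword',
    VARCHAR AccountID    = '$M1&D&H',      /* the account string */
    VARCHAR TdpId        = 'mytdpid',
    VARCHAR LogTable     = 'mydb.target_log',
    VARCHAR TargetTable  = 'mydb.target'
);
```

The same AccountID attribute can be set on the DDL operator; checking the TPT operator reference for your release would confirm the exact semantics.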

Hi, I am Srinivas and this is my first post (and it's a question, of course) in the forum. I am using Teradata SQL Assistant 12.0, and I have a problem with the query window. Even though I am selecting (highlighting) an individual query and pressing to execute...

In the table DBC.TDWMSummaryLog, for the WDID column, is there a table I can link to to get a more meaningful description, or even more information about it?

Can anybody please tell me the syntax for granting CREATE TABLE access on a database? Thanks!

Hi all, I have a requirement in which I need to pass the same DATE parameter to a set of macros. I want to incorporate all the macros into one BTEQ script, so that the user can just specify the parameter value and all the macros are executed in one go. Can anybody please tell me how I can accomplish this using BTEQ? It would be better if you could provide me with a sample script.

I could have created one macro that includes all the SQL statements in sequence and simply executed that macro with the date parameter, but I have a few COLLECT STATS statements in between, and I have learned that we cannot specify more than one COLLECT STATS in a single macro; that's the reason I am switching to BTEQ. Thanks for your quick response.
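One way to sketch this (all database, macro, and column names below are hypothetical): BTEQ itself has no parameter variables, but if the job runs from a shell, the DATE value can be supplied once and substituted into a here-document that generates the BTEQ input; COLLECT STATS statements can then sit between the macro executions as ordinary requests:

```shell
#!/bin/sh
# The single parameter the user supplies.
RUN_DATE="2009-06-30"

# Generate the BTEQ input with the date substituted everywhere.
cat > run_macros.bteq <<EOF
.LOGON tdpid/user,password;
EXEC mydb.macro_one (DATE '$RUN_DATE');
COLLECT STATISTICS ON mydb.stage COLUMN (load_dt);
EXEC mydb.macro_two (DATE '$RUN_DATE');
.LOGOFF;
EOF

# bteq < run_macros.bteq    # run once the generated script looks right
```

On MVS or under a scheduler the same idea applies with whatever text-substitution mechanism is available there.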

I am seeing a weird problem: my Java program reads from a source and creates TPump scripts and .dat files on the fly. Everything works fine on Windows, but recently on AIX I am seeing this problem. What it reports as an "invalid statement" before .LOGON is something that is not in the actual script I generate. In fact, the generated script is archived, and when run manually it runs fine without any change! The invalid statement marked 0001 is in fact part of the last statement of the .IMPORT command in the same script.

---- TPUMP LOG ----
0001 LTYPE=10 OR DMLTYPE=11 OR DMLTYPE=12 OR DMLTYPE=13 ;
**** 00:32:46 UTY2403 An invalid statement was found before the .LOGON statement.
**** 00:32:46 UTY2410 Total processor time used = '0.016644 Seconds'

Any idea what could be causing what looks like a potential memory overrun?

Hi all, why are a Parsing Engine (PE) and an Access Module Processor (AMP) called virtual processors? Regards, Kiran

Hi, I am using a Java program to write the .dat (binary) file to be loaded by a TPump script. For a DECIMAL column type, here is how I write the data:

aValue = ((BigDecimal) aValue).movePointRight(((BigDecimal) aValue).scale());
aValue = ((BigDecimal) aValue).toBigInteger();
byte[] integerBytes = aValue.toByteArray();
int paddingLength = len - integerBytes.length;
// integerBytes is always big-endian (see BigInteger.toByteArray())
if (!BIG_ENDIAN) {
    for (int i = (integerBytes.length - 1); i >= 0; i--) {
        dataBuf[dataOff++] = integerBytes[i];
    }
} else {
    System.arraycopy(integerBytes, 0, dataBuf, dataOff, integerBytes.length);
    dataOff += integerBytes.length;
}
// ... do the padding and write the buffer to the file

The problem is that for a column of type DECIMAL(8,2) I am getting a 2683 error, but only when I run on AIX: TPump fails to insert after the first few records and the ERROR table shows error 2683, whereas on Windows it works fine. I should add that this is due to the difference in byte order (Windows is little-endian, AIX is big-endian). How can I modify the above to work consistently, without affecting other data types? Thanks.
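A self-contained sketch of one way to make the encoding deterministic (the class and method names are mine, not from any Teradata API): choose the byte order by parameter instead of inheriting the platform default, and sign-extend the padding so negative values stay valid two's complement at the full field width:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Arrays;

public class DecimalEncoder {

    // Encode a DECIMAL value as a fixed-width two's-complement integer
    // of `len` bytes, in an explicitly chosen byte order, so the output
    // no longer depends on the platform the JVM happens to run on.
    public static byte[] encode(BigDecimal value, int len, boolean bigEndian) {
        // Shift the scale away: 123.45 (scale 2) becomes the integer 12345.
        BigInteger unscaled = value.movePointRight(value.scale()).toBigInteger();

        // toByteArray() is always big-endian and minimal length.
        byte[] src = unscaled.toByteArray();

        // Pad with the sign byte so negatives remain two's complement.
        byte pad = (unscaled.signum() < 0) ? (byte) 0xFF : (byte) 0x00;
        byte[] out = new byte[len];
        Arrays.fill(out, pad);

        if (bigEndian) {
            // Right-align the magnitude; padding stays on the left.
            System.arraycopy(src, 0, out, len - src.length, src.length);
        } else {
            // Reverse so the least significant byte comes first.
            for (int i = 0; i < src.length; i++) {
                out[i] = src[src.length - 1 - i];
            }
        }
        return out;
    }
}
```

With this, the caller passes the endianness the loading side expects (presumably whatever order your working Windows files used) on both platforms, instead of testing the JVM's BIG_ENDIAN flag, and then copies the returned array into dataBuf.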

Hi, which tool can we use to do performance tuning in Teradata? PMON is used just to view a query and check its status. Regards, Kiran

Hi, can anyone please tell me the differences between a macro and a stored procedure, as both contain sets of SQL statements? Regards, Kiran

An error occurred while running the following script:

Error: Incorrect number of bytes returned from a File Read. Expected: 72, Received: 33

Script source: Chapter 1 "Introduction", "A FastLoad Example", Teradata FastLoad Reference, page 1-35.

sessions 2;
errlimit 25;
logon tdpid/username,password;
CREATE TABLE employee (
  EmpNo SMALLINT FORMAT '9(5)' BETWEEN 10001 AND 32001 NOT NULL,
  Name VARCHAR(12),
  DeptNo SMALLINT FORMAT '999' BETWEEN 100 AND 900,
  PhoneNo SMALLINT FORMAT '9999' BETWEEN 1000 AND 9999,
  JobTitle VARCHAR(12),
  Salary DECIMAL(8,2) FORMAT 'ZZZ,ZZ9.99' BETWEEN 1.00 AND 999000.00,
  YrsExp BYTEINT FORMAT 'Z9' BETWEEN -99 AND 99,
  DOB DATE FORMAT 'MMMbDDbYYYY',
  Sex CHAR(1) UPPERCASE,
  Race CHAR(1) UPPERCASE,
  MStat CHAR(1) UPPERCASE,
  EdLev BYTEINT FORMAT 'Z9' BETWEEN 0 AND 22,
  HCap BYTEINT FORMAT 'Z9' BETWEEN -99 AND 99 )
UNIQUE PRIMARY INDEX( EmpNo );
set record unformatted;
define
  delim0(char(1)),
  EmpNo(char(9)),    delim1(char(1)),
  Name(char(12)),    delim2(char(1)),
  DeptNo(char(3)),   delim3(char(1)),
  PhoneNo(char(4)),  delim4(char(1)),
  JobTitle(char(12)), delim5(char(1)),
  Salary(char(9)),   delim6(char(1)),
  YrsExp(char(2)),   delim7(char(1)),
  DOB(char(11)),     delim8(char(1)),
  Sex(char(1)),      delim9(char(1)),
  Race(char(1)),     delim10(char(1)),
  MStat(char(1)),    delim11(char(1)),
  EdLev(char(2)),    delim12(char(1)),
  HCap(char(2)),     delim13(char(1)),
  newlinechar(char(1))
file=insert.input;
show;
begin loading employee errorfiles error_1, error_2;
insert into employee (:EmpNo, :Name, :DeptNo, :PhoneNo, :JobTitle, :Salary, :YrsExp, :DOB, :Sex, :Race, :MStat, :EdLev, :HCap);
end loading;
logoff;

Comments appreciated.

I have some issues with the BTEQ EXPORT command. The SQL I use in the export is:

SELECT BUSINESS_UNIT, ' | ', EFFDT, ' | ', EFF_STATUS FROM ABC

The output of this is spooled to a file called ABC.txt:

$ head ABC.txt
01454 | 06/0 | A |
01731 | 06/0 | A |
G0360 | 06/0 | A |
01700 | 06/0 | A |
01810 | 06/0 | A |
01500 | 06/0 | A |

When I run the query in Queryman, I get correct data. Result set attached below:

01454 | 01/01/2006 | A |
01731 | 01/01/2006 | A |
G0360 | 01/01/2006 | A |
01700 | 01/01/2006 | A |
01500 | 01/01/2006 | A |

Interestingly, when I check DBC.COLUMNS, I find that this date field has a column width of 4:

ColumnName     ColumnType  ColumnLength
BUSINESS_UNIT  CV          5
EFFDT          DA          4
EFF_STATUS     CV          1

Code snippet of the BTEQ script follows:

/* Set the export file as the table name with extension .txt */
.SET FULLYEAR ON;
.EXPORT DATA FILE=$TABLE_NAME.txt
/* Run the select statement in the file created by the shell script */
.run file $COL_FILE_NAME
.SET FORMAT OFF;
.EXPORT RESET
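For what it's worth, the "06/0" fragments look like the 4-byte internal DATE form (DBC.COLUMNS shows EFFDT as type DA, length 4) that .EXPORT DATA writes in binary being viewed as text. A hedged sketch of the usual fix, assuming the file is meant to be human-readable: export in REPORT mode and cast the date to a character string explicitly:

```
.SET FULLYEAR ON;
.EXPORT REPORT FILE=$TABLE_NAME.txt
SELECT TRIM(BUSINESS_UNIT) || ' | ' ||
       CAST(CAST(EFFDT AS FORMAT 'MM/DD/YYYY') AS CHAR(10)) || ' | ' ||
       TRIM(EFF_STATUS)
FROM ABC;
.EXPORT RESET
```

.EXPORT DATA is the right choice only when the file will be read back by a client utility that understands the binary record layout.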

Hi guys, I am using an EXPORT command in a BTEQ script. The output file needs to be formatted as a comma-, pipe- (|) or ~-delimited file. Can you please help me in this regard? The SQL statement I am passing to the EXPORT command is dynamic, hence cannot be pre-appended with any of the delimiters. The code snippet looks like this:

.EXPORT DATA FILE=$TABLE_NAME.txt
/* Run the select statement in the file created by the shell script */
.run file $COL_FILE_NAME

The content of $COL_FILE_NAME is something that changes dynamically during execution.

We're using SQL Server Enterprise Manager 8.0 to schedule multi-step SQL Assistant scripts and WinSCP file transfers (with dependencies; thus we are using SQL Server rather than Teradata Query Scheduler). When I run under my own login ID, the scripts use the file delimiters (pipe '|') that are set in Tools > Options. But the jobs are executed in SQL Server under a SQL Server manager account, and that account uses tab delimiters. We cannot log in as the SQL Server manager and change the SQL Assistant defaults for that login. Has anyone changed the SQL Assistant registry setting ExportDelim={tab} directly? How? Where?

The TD SQL Assistant menu option Tools > List Columns is not working properly with the wildcard %. In Teradata SQL Assistant, choosing "Tools" and then "List Columns" with a database name of "XXXX" and a table/view name of "%" fails with "The Database or Table is invalid or unavailable". If I use "C%" as the table name it works, pulling all of the columns in all of the tables beginning with "C". It just doesn't work if I try to pull all of the tables with just "%". For some databases it works fine with a wildcard table/view name of "%". FYI, I have full access on the database.

Hello, I am just getting familiar with TPT and I was trying to load a table. I am using demo version 12. Everything seems to be fine with my script, but the job state says: completed with unexpected status from tbuild (2). Can anyone assist me with this? Thanks, Azeem Syed. Below is the script:

DEFINE JOB Movie
DESCRIPTION 'Loading movie table'
(
  DEFINE OPERATOR W_1_o_Movie
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR UserName,
    VARCHAR UserPassword,
    VARCHAR LogTable,
    VARCHAR TargetTable,
    INTEGER BufferSize,
    INTEGER ErrorLimit,
    INTEGER MaxSessions,
    INTEGER MinSessions,
    INTEGER TenacityHours,
    INTEGER TenacitySleep,
    VARCHAR AccountID,
    VARCHAR DateForm,
    VARCHAR ErrorTable1,
    VARCHAR ErrorTable2,
    VARCHAR NotifyExit,
    VARCHAR NotifyExitIsDLL,
    VARCHAR NotifyLevel,
    VARCHAR NotifyMethod,
    VARCHAR NotifyString,
    VARCHAR PauseAcq,
    VARCHAR PrivateLogName,
    VARCHAR TdpId,
    VARCHAR TraceLevel,
    VARCHAR WorkingDatabase
  );

  DEFINE SCHEMA W_0_s_Movie
  (
    MID INTEGER,
    NAME_2 VARCHAR(100),
    ACTOR VARCHAR(100),
    GENERE VARCHAR(50),
    RATING VARCHAR(20),
    RELEASE_YEAR SMALLINT
  );

  DEFINE OPERATOR W_0_o_Movie
  TYPE EXPORT
  SCHEMA W_0_s_Movie
  ATTRIBUTES
  (
    VARCHAR UserName,
    VARCHAR UserPassword,
    VARCHAR SelectStmt,
    INTEGER BlockSize,
    INTEGER MaxSessions,
    INTEGER MinSessions,
    INTEGER TenacityHours,
    INTEGER TenacitySleep,
    INTEGER MaxDecimalDigits,
    VARCHAR AccountID,
    VARCHAR DateForm,
    VARCHAR NotifyExit,
    VARCHAR NotifyExitIsDLL,
    VARCHAR NotifyLevel,
    VARCHAR NotifyMethod,
    VARCHAR NotifyString,
    VARCHAR PrivateLogName,
    VARCHAR TdpId,
    VARCHAR TraceLevel,
    VARCHAR WorkingDatabase
  );

  APPLY
  (
    'INSERT INTO VIDEO_OPERATION.movie_stg (MID,NAME,ACTOR,GENERE,RATING,RELEASE_YEAR) VALUES (:MID,:NAME_2,:ACTOR,:GENERE,:RATING,:RELEASE_YEAR);'
  )
  TO OPERATOR
  (
    W_1_o_Movie[1]
    ATTRIBUTES
    (
      UserName = 'tduser',
      UserPassword = 'tduser',
      LogTable = 'VIDEO_OPERATION.movie_stg_log',
      TargetTable = 'VIDEO_OPERATION.movie_stg',
      TdpId = 'localtd'
    )
  )
  SELECT * FROM OPERATOR
  (
    W_0_o_Movie[1]
    ATTRIBUTES
    (
      UserName = @TeradataSourceUser,
      UserPassword = @TeradataSourcePassword,
      SelectStmt = 'SELECT MID,NAME,ACTOR,GENERE,RATING,RELEASE_YEAR FROM VIDEO_OPERATION.movie2;',
      TdpId = 'localtd'
    )
  );
);