1550 - 1600 of 2462 tags for Tools

Hi all, I have a requirement where I need to pass the same DATE parameter to a set of macros. I want to put all the macros in one BTEQ script, so that the user can just specify the parameter value and all the macros are executed in one go. Can anybody please tell me how I can accomplish this using BTEQ? It would be better if you could provide a sample script. I could have created one macro containing all the SQL statements in sequence and simply executed it with the date parameter, but I have a few COLLECT STATISTICS statements in between, and I have learnt that we cannot specify more than one COLLECT STATISTICS statement in a single macro; that is why I am switching to BTEQ. Thanks for your quick response.
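One common approach is a small shell wrapper that substitutes the date into a generated BTEQ script, so the COLLECT STATISTICS statements can sit between the EXEC calls. This is only a sketch: the logon string, macro names, and the COLLECT STATISTICS target are placeholders, not anything from the post above.

```shell
#!/bin/sh
# Take the run date as the first argument (a default is given only so
# the sketch runs standalone).
RUN_DATE="${1:-2009-05-29}"

# Generate a BTEQ script that passes the same DATE parameter to every macro.
cat > run_macros.btq <<EOF
.LOGON tdpid/user,password;
EXEC macro1(DATE '${RUN_DATE}');
EXEC macro2(DATE '${RUN_DATE}');
COLLECT STATISTICS ON mydb.mytable COLUMN (col1);
EXEC macro3(DATE '${RUN_DATE}');
.LOGOFF;
.QUIT
EOF

# The user only supplies the date; BTEQ then runs everything in one go:
#   bteq < run_macros.btq
cat run_macros.btq
```

The wrapper keeps the parameter in one place, and because BTEQ (unlike a macro) has no restriction on how many COLLECT STATISTICS statements appear in a script, they can be interleaved freely.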

I am seeing a weird problem: my Java program reads from a source and creates TPump scripts and .dat files on the fly. Everything works fine on Windows, but recently on AIX I am seeing this problem. What it reports as an "invalid statement" before .LOGON is something that is not in the actual script I generate. In fact, the generated script is archived, and when run manually it runs fine without any change! The invalid statement marked 0001 is in fact part of the last statement in the .IMPORT command in the same script.

---- TPUMP LOG ----
0001 LTYPE=10 OR DMLTYPE=11 OR DMLTYPE=12 OR DMLTYPE=13 ;
**** 00:32:46 UTY2403 An invalid statement was found before the .LOGON statement.
**** 00:32:46 UTY2410 Total processor time used = '0.016644 Seconds'

Any idea what could be causing what looks like a potential memory overrun?

Hi all, why are the Parsing Engine (PE) and the Access Module Processor (AMP) called virtual processors? Regards, Kiran

Hi, I am using a Java program to write the .dat (binary) file to be loaded by a TPump script. For a DECIMAL column type, here is how I write the data (reformatted; the original post was missing the array index on integerBytes in the copy loop):

aValue = ((BigDecimal) aValue).movePointRight(((BigDecimal) aValue).scale());
aValue = ((BigDecimal) aValue).toBigInteger();
byte[] integerBytes = aValue.toByteArray();
int paddingLength = len - integerBytes.length;
// integerBytes is always big-endian (see BigInteger.toByteArray())
if (!BIG_ENDIAN) {
    for (int i = integerBytes.length - 1; i >= 0; i--) {
        dataBuf[dataOff++] = integerBytes[i];
    }
} else {
    System.arraycopy(integerBytes, 0, dataBuf, dataOff, integerBytes.length);
    dataOff += integerBytes.length;
}
// ... do the padding and write the buffer to file

Now the problem: for a column of type DECIMAL(8,2) I am getting a 2683 error, but only when I run on AIX; TPump fails to insert after the first few records and the error table shows error 2683, whereas on Windows it works fine. I forgot to add that this is due to the difference in byte order (Windows is little-endian, AIX is big-endian). How can I modify the above to work consistently, without affecting the other data types? Thanks.

Hi, which tool can we use to do performance tuning in Teradata? PMON is only used to view a query and check its status. Regards, Kiran

Hi, can anyone please tell me the differences between a macro and a stored procedure, since both contain a set of SQL statements? Regards, Kiran

An error occurred while running the following script (adapted from the FastLoad example in the Teradata FastLoad Reference):

Error: incorrect number of bytes returned from a File Read. Expected: 72, received: 33.

Script source:

sessions 2;
errlimit 25;
logon tdpid/username,password;
CREATE TABLE employee (
  EmpNo SMALLINT FORMAT '9(5)' BETWEEN 10001 AND 32001 NOT NULL,
  Name VARCHAR(12),
  DeptNo SMALLINT FORMAT '999' BETWEEN 100 AND 900,
  PhoneNo SMALLINT FORMAT '9999' BETWEEN 1000 AND 9999,
  JobTitle VARCHAR(12),
  Salary DECIMAL(8,2) FORMAT 'ZZZ,ZZ9.99' BETWEEN 1.00 AND 999000.00,
  YrsExp BYTEINT FORMAT 'Z9' BETWEEN -99 AND 99,
  DOB DATE FORMAT 'MMMbDDbYYYY',
  Sex CHAR(1) UPPERCASE,
  Race CHAR(1) UPPERCASE,
  MStat CHAR(1) UPPERCASE,
  EdLev BYTEINT FORMAT 'Z9' BETWEEN 0 AND 22,
  HCap BYTEINT FORMAT 'Z9' BETWEEN -99 AND 99)
UNIQUE PRIMARY INDEX( EmpNo );
set record unformatted;
define
  delim0(char(1)), EmpNo(char(9)),
  delim1(char(1)), Name(char(12)),
  delim2(char(1)), DeptNo(char(3)),
  delim3(char(1)), PhoneNo(char(4)),
  delim4(char(1)), JobTitle(char(12)),
  delim5(char(1)), Salary(char(9)),
  delim6(char(1)), YrsExp(char(2)),
  delim7(char(1)), DOB(char(11)),
  delim8(char(1)), Sex(char(1)),
  delim9(char(1)), Race(char(1)),
  delim10(char(1)), MStat(char(1)),
  delim11(char(1)), EdLev(char(2)),
  delim12(char(1)), HCap(char(2)),
  delim13(char(1)), newlinechar(char(1))
file=insert.input;
show;
begin loading employee errorfiles error_1, error_2;
insert into employee (:EmpNo, :Name, :DeptNo, :PhoneNo, :JobTitle, :Salary, :YrsExp, :DOB, :Sex, :Race, :MStat, :EdLev, :HCap);
end loading;
logoff;

Comments appreciated.
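That error usually means one record in the file is shorter than the fixed record length the DEFINE adds up to (a truncated last row or a missing trailing newline, for example). As a quick sanity check outside FastLoad, you can flag any line whose length differs from the first line's. This is a sketch: the demo input created here stands in for the real `insert.input`.

```shell
# Demo input: two 4-byte records and one truncated record
# (stands in for the real insert.input).
printf 'aaaa\nbbbb\ncc\n' > insert.input

# A fixed-width FastLoad input should have identical lengths on every line;
# report any line whose length differs from line 1's.
awk 'NR == 1 { want = length($0) }
     length($0) != want { printf "line %d: %d bytes (expected %d)\n", NR, length($0), want }' \
    insert.input > length_report.txt
cat length_report.txt
```

Any line the report flags is a candidate for the "incorrect number of bytes" failure.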

I have some issues with the BTEQ EXPORT command. The SQL I use in the export is:

SELECT BUSINESS_UNIT, ' | ', EFFDT, ' | ', EFF_STATUS FROM ABC

The output of this is spooled to a file called ABC.txt:

$ head ABC.txt
01454 | 06/0 | A |
01731 | 06/0 | A |
G0360 | 06/0 | A |
01700 | 06/0 | A |
01810 | 06/0 | A |
01500 | 06/0 | A |

When I run the query in Queryman, I get correct data. Result set attached below:

01454 | 01/01/2006 | A |
01731 | 01/01/2006 | A |
G0360 | 01/01/2006 | A |
01700 | 01/01/2006 | A |
01500 | 01/01/2006 | A |

Interestingly, when I check DBC.COLUMNS, I find that this date field has a column width of 4:

ColumnName     ColumnType  ColumnLength
BUSINESS_UNIT  CV          5
EFFDT          DA          4
EFF_STATUS     CV          1

Code snippet of the BTEQ script follows. (The first comment block sets the export file to the table name with a .txt extension; the second runs the SELECT statement from the file created by the shell script.)

.SET FULLYEAR ON;
.EXPORT DATA FILE=$TABLE_NAME.txt
.RUN FILE $COL_FILE_NAME
.SET FORMAT OFF;
.EXPORT RESET

Hi guys, I am using an EXPORT command in a BTEQ script. The output file needs to be formatted as a comma-, pipe- (|) or ~-delimited file. Can you please help me in this regard? The SQL statement I am passing to the EXPORT command is dynamic, hence it cannot be pre-appended with any of the delimiters. The code snippet looks like this:

.EXPORT DATA FILE=$TABLE_NAME.txt
.RUN FILE $COL_FILE_NAME

The content of $COL_FILE_NAME is something that changes dynamically during execution. (The .RUN FILE line runs the SELECT statement from the file created by the shell script.)
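Since BTEQ's EXPORT has no delimiter option of its own, one workaround is to have the shell script that builds $COL_FILE_NAME wrap the dynamic column list into a concatenated, delimited SELECT before BTEQ ever sees it. This is a sketch under assumptions: the column list and table name are inputs the caller already has, and non-character columns (dates, numerics) may additionally need a CAST before TRIM.

```shell
#!/bin/sh
# Build "TRIM(c1) || '|' || TRIM(c2) || ..." from a space-separated
# column list, a delimiter, and a table name (all caller-supplied).
build_select() {
    cols="$1"; delim="$2"; table="$3"
    list=""
    for c in $cols; do
        # Prepend the delimiter between columns, not before the first one.
        [ -n "$list" ] && list="$list || '$delim' || "
        list="${list}TRIM($c)"
    done
    echo "SELECT $list FROM $table;"
}

build_select "BUSINESS_UNIT EFFDT EFF_STATUS" "|" "ABC"
```

The generated statement is then written to $COL_FILE_NAME, and the exported rows come out already delimited, regardless of which columns the dynamic SQL selects.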

We're using SQL Server Enterprise Manager 8.0 to schedule multi-step SQL Assistant scripts and WinSCP file transfers (with dependencies, which is why we are using SQL Server rather than Teradata Query Scheduler). When I run under my own login ID, the scripts use the file delimiter (pipe '|') that is set in Tools > Options. But the jobs are executed in SQL Server under a SQL Server manager account, and that account uses tab delimiters. We cannot log in as the SQL Server manager and change the SQL Assistant defaults for that login. Has anyone changed the SQL Assistant registry setting ExportDelim={tab} directly? How? Where?

The Teradata SQL Assistant menu option Tools > List Columns is not working properly with the wildcard %. In Teradata SQL Assistant, Tools > List Columns with a database name of "XXXX" and a table/view name of "%" fails with "The Database or Table is invalid or unavailable". If I use "C%" as the table name it works, pulling all of the columns in all of the tables beginning with "C". It just doesn't work if I try to pull all of the tables with just "%". For some databases it works fine with a table/view name of "%". FYI, I have full access on the database.

Hello, I am just getting familiarized with TPT and I was trying to load a table. I am using demo version 12. Everything seems to be fine with my script, but the job state says: completed with unexpected status from tbuild (2). Can anyone assist me with this? Thanks, Azeem Syed. Below is the script:

DEFINE JOB Movie
DESCRIPTION 'Loading movie table'
(
  DEFINE OPERATOR W_1_o_Movie
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR UserName,
    VARCHAR UserPassword,
    VARCHAR LogTable,
    VARCHAR TargetTable,
    INTEGER BufferSize,
    INTEGER ErrorLimit,
    INTEGER MaxSessions,
    INTEGER MinSessions,
    INTEGER TenacityHours,
    INTEGER TenacitySleep,
    VARCHAR AccountID,
    VARCHAR DateForm,
    VARCHAR ErrorTable1,
    VARCHAR ErrorTable2,
    VARCHAR NotifyExit,
    VARCHAR NotifyExitIsDLL,
    VARCHAR NotifyLevel,
    VARCHAR NotifyMethod,
    VARCHAR NotifyString,
    VARCHAR PauseAcq,
    VARCHAR PrivateLogName,
    VARCHAR TdpId,
    VARCHAR TraceLevel,
    VARCHAR WorkingDatabase
  );

  DEFINE SCHEMA W_0_s_Movie
  (
    MID INTEGER,
    NAME_2 VARCHAR(100),
    ACTOR VARCHAR(100),
    GENERE VARCHAR(50),
    RATING VARCHAR(20),
    RELEASE_YEAR SMALLINT
  );

  DEFINE OPERATOR W_0_o_Movie
  TYPE EXPORT
  SCHEMA W_0_s_Movie
  ATTRIBUTES
  (
    VARCHAR UserName,
    VARCHAR UserPassword,
    VARCHAR SelectStmt,
    INTEGER BlockSize,
    INTEGER MaxSessions,
    INTEGER MinSessions,
    INTEGER TenacityHours,
    INTEGER TenacitySleep,
    INTEGER MaxDecimalDigits,
    VARCHAR AccountID,
    VARCHAR DateForm,
    VARCHAR NotifyExit,
    VARCHAR NotifyExitIsDLL,
    VARCHAR NotifyLevel,
    VARCHAR NotifyMethod,
    VARCHAR NotifyString,
    VARCHAR PrivateLogName,
    VARCHAR TdpId,
    VARCHAR TraceLevel,
    VARCHAR WorkingDatabase
  );

  APPLY
  (
    'INSERT INTO VIDEO_OPERATION.movie_stg (MID,NAME,ACTOR,GENERE,RATING,RELEASE_YEAR) VALUES (:MID,:NAME_2,:ACTOR,:GENERE,:RATING,:RELEASE_YEAR);'
  )
  TO OPERATOR
  (
    W_1_o_Movie[1]
    ATTRIBUTES
    (
      UserName = 'tduser',
      UserPassword = 'tduser',
      LogTable = 'VIDEO_OPERATION.movie_stg_log',
      TargetTable = 'VIDEO_OPERATION.movie_stg',
      TdpId = 'localtd'
    )
  )
  SELECT * FROM OPERATOR
  (
    W_0_o_Movie[1]
    ATTRIBUTES
    (
      UserName = @TeradataSourceUser,
      UserPassword = @TeradataSourcePassword,
      SelectStmt = 'SELECT MID,NAME,ACTOR,GENERE,RATING,RELEASE_YEAR FROM VIDEO_OPERATION.movie2;',
      TdpId = 'localtd'
    )
  );
);

Hi people, is it possible to store documents in a Teradata database? If so, can you please let me know how to do the same? Regards, Rock

Hi people, I would like to know what exactly happens in an "all-AMPs SUM" step. This operation is performed when using a GROUP BY clause in a query. Refer to the 3rd step in the EXPLAIN of the query:

3) We do an all-AMPs SUM step to aggregate from database.tablename by way of an all-rows scan with no residual conditions, and the grouping identifier in field 1025. Aggregate intermediate results are computed locally, then placed in Spool 3. The size of Spool 3 is estimated with low confidence to be 3,930 rows. The estimated time for this step is 0.01 seconds.
4) We do an all-AMPs RETRIEVE step from Spool 3 (Last Use) by way of an all-rows scan into Spool 1 (group_amps), which is built locally on the AMPs. The size of Spool 1 is estimated with low confidence to be 3,930 rows. The estimated time for this step is 0.01 seconds.

Regards, Rock

Hi, I need to create around 1000 macros, and I am using JDBC for this. I find that we need to issue a commit after each CREATE MACRO statement; if the commit is not issued, the subsequent CREATE MACRO execution fails with "Teradata: et or commit was expected". I believe this is expensive and quite slow. Please let me know if there is a faster alternative. Attached is the sample code used to create a macro using JDBC; these statements are called in a loop:

statement = dbConnection.createStatement();
rowCount = statement.executeUpdate(insertMacro);
statement.close();
dbConnection.commit();
statement = dbConnection.createStatement();
rowCount = statement.executeUpdate(updateMacro);
statement.close();

Regards, Ranga
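If the macro DDL can be generated up front rather than statement-by-statement over JDBC, one alternative is to emit all the CREATE MACRO statements into a single BTEQ script and run it once, avoiding a network round trip plus explicit commit per macro. This is a sketch under assumptions: the database, macro names, and macro body are placeholders, and it presumes a Teradata-mode session in which BTEQ implicitly commits each DDL statement.

```shell
#!/bin/sh
# Generate create_macros.btq containing N CREATE MACRO statements.
N=5   # would be ~1000 in the real case
: > create_macros.btq
i=1
while [ "$i" -le "$N" ]; do
    cat >> create_macros.btq <<EOF
CREATE MACRO mydb.macro_$i (d DATE) AS (
  SELECT * FROM mydb.some_table WHERE load_dt = :d;
);
EOF
    i=$((i + 1))
done
echo ".QUIT" >> create_macros.btq
# Then (logon lines omitted):  bteq < create_macros.btq
```

One batch submission also makes the run restartable: if it fails partway, the log shows exactly which CREATE MACRO statement errored.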

Hi, I am facing a severe problem while loading data via FastLoad. My column data type is DECIMAL(30,5), but my source data has 20 digits after the decimal point, i.e. (30,20). Please help me out: what should I do?

Hi forum gurus, please clarify what data will cause insertion into the ET table and what data will halt FastLoad/MultiLoad abruptly, with some examples. We have seen it written in the manuals that translation errors are caught in the ET table and uniqueness violations in UV. Can anybody please explain with examples: 1) data likely to be captured in ET, and 2) data likely to halt the load.

Hi, I want to read the user ID and password from a file instead of hard-coding them in the MLOAD control file. Please help me with how to do this. Regards, KKR
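One approach I have seen (a sketch; verify the .RUN FILE command against the MultiLoad manual for your release, and the file/table names here are placeholders) is to keep the .LOGON line in a separate, permission-restricted file and pull it in at the top of the control file:

```
/* logon.txt, readable only by the batch user, contains one line: */
/*   .LOGON tdpid/userid,password;                                */

.RUN FILE logon.txt;
.LOGTABLE mydb.mload_log;
/* ...rest of the MLOAD script... */
```

The credentials then live in a single file with tight filesystem permissions, and the control files themselves can be checked into version control safely.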

Hello, I'm running BTEQ via MVS batch (the BTQMAIN program). Since yesterday, I'm getting the below error for my SQL:

Failure 3710 Insufficient memory to parse this request, during Optimizer phase. Statement# 1, Info = 0

I figured out that it has something to do with MaxParseTreeSegs, but I don't know how to use it in my case; I don't have much access to the Teradata system behind the scenes. Is there any way to tackle this issue, maybe by including some statement at the top of my SQL query? Please suggest! Thanks, Tushar Saxena

Hi, could anyone please help me with the below:

SEL C1, COUNT(*) FROM T1 GROUP BY 1;
.IF ERRORCODE <> 0 THEN .QUIT
SEL * FROM T2 WHERE Status = 'F';
.IF ACTIVITYCOUNT > 0 THEN .QUIT 99
LOGOFF
.QUIT
.QUIT 99

Note: T2 contains the failed records. If a = 0, then I would like to execute the next IF statement, and ultimately the script should fail with return code 99. If a = 1, then I would like to execute the SQL and exit normally with return code 0. Here a is a variable that is set manually every time.

All, is there a way to extract all the DDLs for all the tables and views that belong to a database? Thanks in advance. pk

I am running TD 12.0 (Express Edition) on Windows XP with Apache Tomcat running. I have a script file called XXX.TXT which is sitting in /ROOT/FASTLOAD/.

I'd like to archive all databases on Teradata for Windows. With the following logon script, we can successfully log on and perform further archive actions easily:

LOGON tdpid/user,password;
ARCHIVE DATA TABLES (DBC) ALL, RELEASE LOCK, INDEXES, OUTPUT ROW COUNT, FILE = archive;
LOGOFF;

But the question is: how do we take advantage of SSO to do the job instead?

Hi, I have a mainframe FB file which contains a header record (e.g. CBS SYSTEM 29MAY2009), detail records of different lengths in which column values are separated by '~', and finally a trailer record (e.g. record count: 25000). I want to load this data into a header table, a detail table, and a trailer table. My question is: can I load the data into all 3 tables (header record to header table, detail records to detail table, etc.) using one MLOAD job? I tried defining 3 layouts, but there is no indicator field to identify header, trailer, or detail records, so the job abended. One method I know is to use a splitter job to separate the above file into 3 separate files and load them. Can you please suggest any other efficient way to load the data into the 3 tables?
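For the splitter route, the split itself doesn't need a separate ETL job: a one-pass awk sketch like the following (file names are placeholders, and the demo input stands in for the real FB file) writes the first record to a header file, the last to a trailer file, and everything in between to a detail file, which the three loads then pick up.

```shell
# Demo input: header, three '~'-delimited detail records, trailer.
printf 'CBS SYSTEM 29MAY2009\n1~a\n2~b\n3~c\nrecord count: 3\n' > input.dat

# One pass: line 1 -> header; each line is held back one step so that
# the final held line (the trailer) is only written at END.
awk 'NR == 1 { print > "header.dat"; next }
     have   { print prev > "detail.dat" }
            { prev = $0; have = 1 }
     END    { if (have) print prev > "trailer.dat" }' input.dat
```

Buffering the previous line is what lets a single pass route the trailer correctly without knowing the record count in advance.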

Hi all, I have a file with 9K records which loads successfully using the traditional FastLoad utility. When I try to load the same file using the Teradata Parallel Transporter LOAD operator, all of these records go into the ET table with error code 2665 (invalid date) on one of the DATE columns. I've even tried giving a FORMAT in the INSERT/SELECT part, but the job still reports that it completed successfully while all 9K records end up in the ET table. The file is tab-delimited. Can you please advise where I am going wrong? Regards, Mtlrsk

Is there a way to run a COLLECT STATISTICS statement in the DDL operator? Or should I use a different type of operator to run COLLECT STATISTICS? R

Hi, as we know, work tables are used to carry the data that will be applied during the application phase. But in an MLOAD delete there is no data to be acquired, so there is no acquisition phase. Could you please tell me what the need for the work table is in MLOAD DELETE?

Does having hash indexes impact the archive run time for a database? Thanks.

Hello, I am inserting 10 records into table 'T', then updating and deleting a record, as follows. Table 'T' has a 'pkcol' (primary key, INTEGER) column and a 'val' VARCHAR(20) column.

********** SQL **************
INSERT INTO T VALUES (1, 'test');
...
INSERT INTO T VALUES (10, 'test');
UPDATE T SET pkcol = 11 WHERE pkcol = 10;
DELETE FROM T WHERE pkcol = 11;
*****************************

When executing this with SESSIONS=6 and THRESHOLD=2 in the TPump configuration script file, the TPump job fails. But if I run it with SESSIONS=1 and THRESHOLD=1, then it passes and the data is applied. Is there any resolution to this problem, or some workaround for making it work with a higher number of sessions? Thanks & Regards

Hi, I have been trying to export XML data using FastExport. My script is working fine, but I can see some junk characters at the beginning of each line in my output file.

Hello all, I know the tool is named Teradata SQL Assistant, but I am curious whether I can connect to a SQL Server DB using it. Has anyone tried doing that? Can someone help me with connecting to a SQL Server 2000 DB using TD SQL Assistant? Any help will be highly appreciated. Thank you, az_maverick

Hi, actually I am new to Teradata, but I know the basic concepts. We create indexes when we create tables in Teradata and do not mention primary keys. How do I find the primary key in an existing Teradata table? Please let me know the solution.

Hello, currently I am having issues getting the results of a query from a Teradata DW server if the query takes longer than 30 seconds to run. I know that this is the default, and I did look it up in the forums, but I still can't seem to figure out how to change the CommandTimeout parameter. I am using the .NET Data Provider for Teradata with Visual Studio 2005 SP1, and the query will be run in an ASP.NET web page. I am creating the queries in a DataSet (.xsd file) and then binding them to a data viewer using an ObjectDataSource. How would I go about changing the CommandTimeout default value? Thanks, Matt

Hello, I am using Teradata SQL Assistant to connect to a TD server. I was able to set up a DSN for the TD server. However, in the DB Explorer pane I am unable to view the tables, though I am able to query the DB. So if I run a query, I can view the results, and when I ask for the list of tables and views from the List Tables toolbar, I can view the tables, views, and procedures. However, when I add the DB using Add Database and expand it to view the tables, it does not show me the tables. I was wondering if someone could help me with this problem. Thank you already. :) Regards, az_maverick

Hi people, can any of you explain how exactly the SERIALIZE option helps in improving performance, and also what its functionality is? Thanks, Rock

Hi people, in one of the articles I came across, it said "pack factor and number of sessions contribute to the level of parallelism inside TD". Can anyone tell me exactly how this happens? Thanks, Rock

What is the difference between a partitioned primary index and a secondary index? Please tell me as early as possible. Thanks & regards, k.srinivasarao

Hi, is it possible to skip data errors during FastLoad and continue loading the other records from the file? I have information in tab-delimited text files, and when some field is too long, FastLoad stops loading all the data and reports the error. In Oracle SQL*Loader, such records would go to a bad file and I would be able to decide later whether I want to deal with them.
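FastLoad does divert most row-level data errors to its error tables and only stops once ERRLIMIT is reached, but a record that breaks the input format itself can still kill the job. One workaround, in the spirit of SQL*Loader's bad file, is to pre-filter the tab-delimited input: too-long records go to a reject file, and the rest get loaded. This is a sketch; the field-width limits and file names are made up for illustration.

```shell
# Demo input: the third record's 2nd field exceeds the 5-char limit
# (stands in for the real tab-delimited load file).
printf '1\tabc\n2\tde\n3\ttoolongvalue\n' > input.txt

# max[i] = allowed width of field i (placeholder limits, mirroring the
# CHAR/VARCHAR widths in the FastLoad DEFINE). Records passing the check
# go to good.txt for loading; the rest go to bad.txt for later review.
awk -F '\t' '
BEGIN { max[1] = 4; max[2] = 5 }
{
    ok = 1
    for (i in max)
        if (length($i) > max[i]) ok = 0
    print > (ok ? "good.txt" : "bad.txt")
}' input.txt
```

FastLoad then only ever sees records that fit the layout, and bad.txt plays the role of Oracle's bad file.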

Hi, I had a long-running MLOAD which had been running for 7 hours. I looked at the AMP state and the PE state in TD Manager, and used Query Session to find the current status of the MLOAD; please find below the report it generated, showing the State Detail and the MLoad Phase. The ETL job completed some time after I generated this report. I am a bit confused: as per the report, how can the data loading be in progress if the parent and child sessions are inactive? Can anyone give me insight into the concurrent occurrence of the State Detail and MLoad Phase highlighted above? Thanks in advance.

Hi, I am running the following script:

.LOGTABLE tduser.LOG_TBL;
.LOGON DEMOTDAT/tduser,tduser;
CREATE TABLE TDUSER.EMPINFO (
  EMPNO SMALLINT FORMAT '9(5)' BETWEEN 10001 AND 32001 NOT NULL,
  NAME VARCHAR(12),
  DEPTNO SMALLINT FORMAT '999' BETWEEN 100 AND 900,
  PHONENO SMALLINT FORMAT '9999' BETWEEN 1000 AND 9999,
  JOBTITLE VARCHAR(12),
  SALARY DECIMAL(8,2) FORMAT 'ZZZ,ZZZ.99' BETWEEN 1.00 AND 999000.00,
  YRSEXP BYTEINT FORMAT 'Z9' BETWEEN 0 AND 22,
  DOB DATE FORMAT 'MMMBDDBYYYY',
  SEX CHAR(1) UPPERCASE,
  RACE CHAR(1) UPPERCASE,
  MSTAT CHAR(1) UPPERCASE,
  EDLEV BYTEINT FORMAT 'Z9' BETWEEN 0 AND 22,
  HCAP BYTEINT FORMAT 'Z9' BETWEEN -99 AND 99)
UNIQUE PRIMARY INDEX (EMPNO);
.BEGIN IMPORT MLOAD TABLES TDUSER.EMPINFO
  WORKTABLES TDUSER.EMPTEMP
  ERRORTABLES TDUSER.EMPERR TDUSER.EMPERROR
  AMPCHECK NONE;
.LAYOUT EMPLAYOT;
.FIELD EMPNO 2 CHAR(9);
.FIELD NAME 12 CHAR(12);
.FIELD DEPTNO 25 CHAR(3);
.FIELD PHONENO 29 CHAR(4);
.FIELD JOBTITLE 34 CHAR(12);
.FIELD SALARY 47 CHAR(9);
.FIELD YRSEXP 57 CHAR(2);
.FIELD DOB 60 CHAR(11);
.FIELD SEX 72 CHAR(1);
.FIELD RACE 74 CHAR(1);
.FIELD MSTAT 76 CHAR(1);
.FIELD EDLEV 78 CHAR(2);
.FIELD HCAP 81 CHAR(2);
.DML LABEL INSEMP;
INSERT INTO EMPINFO.*;
.IMPORT INFILE 'C:\EMPDTLS.INPUT' FORMAT TEXT LAYOUT EMPLAYOT APPLY INSEMP;
.END MLOAD;
.LOGOFF;

with the input file:

 10021 Brown, Jo    200 2312 Development  63000.00 20 Jan 01 1955 F G M 16  0
 10001 Jones, Bill  100 5376 President    83000.00 15 Jan 01 1960 M G M 14  0
 10002 Smith, Jim   100 4912 Sales        73000.00 10 Jan 01 1970 M G M 13  1
 10028 Lee, Sandra  200 5844 Support      77000.00  4 Jan 01 1971 F G M 18  0
 10029 Berg, Andy   200 2312 Test         67000.00 10 Jan 01 1967 M G M 15  0
 10023 Ayer, John   300 4432 Accounting   52000.00  8 Jan 01 1965 M G M 13  0

The positions are correct; I mean the start position in each FIELD command matches the input file. The table is getting created, but I am getting a problem while loading. Please suggest how to go about it.

I have a file with a large number of fields, and I want to load only certain fields into an empty table using FastLoad. While defining the input file, do I need to give the layout of the entire file, or is there an option to define only the required fields to load the table, as in MLOAD?

Hi, I have Teradata Express Edition 12.0 (the evaluation and development edition). I'm using FastLoad to load data into the DB, and it takes 50 minutes to load 1.5 GB of data: 5.3 million records, each record consisting of 34 columns, with an average record length of 300 characters. There are no duplicate rows in the table. Why does it take so long? (I notice that while it's running, the FastLoad process hardly consumes any CPU.) Is it due to the evaluation version? What is the performance supposed to be in a production installation? Thanks for any help.

Hi all, I have the below requirement. There is a source table containing two columns as below:

col1    col2
laxman  25:34:56
rama    64:1:34:57:56
raja    1:2:5
rani    4:9:0:8

I need to write a utility through which I should get the following output. Target1 is a table where col1 is a unique ID generated for each value, and col2 holds the distinct IDs from the source col2:

col1  col2
1     25
2     34
3     56
4     64
5     1
6     57
7     2
8     5
9     4
10    9
11    0
12    8

And Target2 as follows:

col1    col2
laxman  25
laxman  34
laxman  56
rama    64
rama    1
rama    34
rama    57
rama    56
raja    1
raja    2
raja    5
rani    4
rani    9
rani    0
rani    8

I hope you understand the requirement. I need the BTEQ script urgently; please help me out with any suggestions.
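Whatever populates the tables, the core operation is just splitting col2 on ':' into one row per value. An awk sketch of that normalization (the file name and two-column layout are assumptions) produces the shape of Target2; Target1's distinct, numbered IDs can then be derived from its second column.

```shell
# Demo source rows: name, then colon-separated values (space-separated columns).
printf 'laxman 25:34:56\nrama 64:1:34:57:56\n' > src.txt

# Emit one "name value" row per colon-separated element of col2.
awk '{
    n = split($2, v, ":")
    for (i = 1; i <= n; i++) print $1, v[i]
}' src.txt > target2.txt

cat target2.txt
```

Piping target2.txt's second column through sort -u and numbering the result gives the Target1 pairs; alternatively, the same split can be done in pure SQL on recent Teradata releases, but that depends on the version available.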

Since we have a 127-DML limit per IMPORT, I am splitting the work across 4 IMPORT statements in a single LOAD. I am concerned about whether those 4 IMPORTs would execute in parallel, since I want to ensure sequential data integrity. On a similar note, am I correct in assuming that using multiple sessions would break such sequential integrity?

Hi, I am running Windows Advanced Server 2003 64-bit and want to install TTU V12. All goes well when just installing the 32-bit versions, except that one of the applications (Perl) is a 64-bit app and requires a 64-bit ODBC driver and DSN. I have tried installing both the 32-bit and 64-bit versions of GSS (apparently successfully), but when I come to install the 64-bit Shared ICU, it seems to overwrite the 32-bit DLLs even though I nominate a different directory, and then my 32-bit apps (e.g. SQL Assistant) don't work any more. I can fix SQL Assistant just by reinstalling the 32-bit Shared ICU. Thanks, Robert

How do I make MLoad read a variable-length file instead of a fixed-width one? Any positive answers are welcome, ideally with an example. Thanks a ton!
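The usual route for delimited, variable-length input is the FORMAT VARTEXT option of .IMPORT, with '*' start positions in the layout. This is a sketch: the layout, DML, table, and file names are placeholders, and note that every field in a VARTEXT layout must be declared VARCHAR.

```
.LAYOUT varlayout;
.FIELD col1 * VARCHAR(10);
.FIELD col2 * VARCHAR(30);

.DML LABEL insrec;
INSERT INTO mydb.mytable (col1, col2) VALUES (:col1, :col2);

.IMPORT INFILE datafile.txt
        FORMAT VARTEXT '|'
        LAYOUT varlayout
        APPLY insrec;
```

The character after VARTEXT is the field delimiter, so the same layout works for pipe-, comma-, or tab-separated files by changing that one quoted character.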

Hi guys! I'm having a problem with my MLoad script:

------------------------------------------------------
.LOGTABLE mydatabase.logtable;
.LOGON ;
DATABASE mydatabase;
CREATE TABLE mydatabase.MYTABLE (
  IDFIELD INTEGER NOT NULL,
  CODEFIELD INTEGER NOT NULL)
PRIMARY INDEX ( COD_COLETA );
.BEGIN IMPORT MLOAD TABLES MYTABLE SESSIONS 2;
.LAYOUT inslayout;
.FIELD IDFIELD * varchar(10);
.FIELD CODEFIELD * varchar(10);
.DML LABEL insdml;
INSERT INTO MYTABLE .*;
.IMPORT INFILE D:\Temp\Olympio\Data.txt FORMAT VARtext ',' LAYOUT inslayout APPLY insdml;
.END MLOAD;
.LOGOFF;
------------------------------------------------------

Data.txt has this data:

123456,1

An error appears at this point. What can I do to solve this problem? Thanks for the help, Anderson

Hi! I am using Informatica 7.1.3 and the database is Teradata 6.1. Now I want to load the flat-file data into Teradata using a FastLoad script. How do I configure my workflow settings using the Teradata FastLoad external loader? Thanks in advance, cnew


Hi! I have multiple flat files, say 3 flat files with the same structure, and these 3 files need to be loaded into one table using FastLoad. I tried, but once I loaded the first file I could not load the second, as the target table needs to be empty for FastLoad. I'd appreciate a quick response if anyone can tell me the script. Thanks in advance.