1550 - 1600 of 2359 tags for Tools

The error info is: 429: The monitor object (TDMon5.dll) is missing, or is not correctly registered.

Hi, MultiLoad automatically creates error tables while loading from a file to the database. Is there any way to generate error files on the Unix box instead of error tables? Please help me in this case. Thank you. Regards, Kiran

Hi, I am getting the below error while inserting data into a table:

** Error: Import data size does not agree with byte length. The cause may be: 1) IMPORT DATA vs. IMPORT REPORT 2) incorrect incoming data 3) import file has reached end-of-file.

I am importing data from the below file:

cat /home/user/jj
xyz 1
abc 2

My BTEQ script is as below:

.logon xyz/loginid,password
.IMPORT DATA FILE = /home/user/jj, SKIP = 2
.QUIET ON
.REPEAT *
USING NAME (VARCHAR(25)), ID (INTEGER)
INSERT INTO DB.TABLE (NAME, ID) VALUES (:NAME, :ID);
COMMIT;
.LOGOFF;
.QUIT

My table definition is:

CREATE SET TABLE DB.TABLE, NO FALLBACK, NO BEFORE JOURNAL, NO AFTER JOURNAL, CHECKSUM = DEFAULT
( NAME VARCHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC TITLE 'MANAGER_NAME' NOT NULL,
  ID INTEGER )
PRIMARY INDEX ( ID );

Please help me out of this. Thanks a lot. Regards, Ravi.
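.IMPORT DATA expects the binary record format written by .EXPORT DATA, not a readable text file, which commonly produces exactly this size-mismatch error; also note that SKIP = 2 skips the first two records, which here is the entire file. A minimal, untested sketch using .IMPORT VARTEXT instead (with VARTEXT, all USING fields must be VARCHAR, and ' ' assumes the file is space-delimited):

```
.logon xyz/loginid,password
.IMPORT VARTEXT ' ' FILE = /home/user/jj
.QUIET ON
.REPEAT *
USING NAME (VARCHAR(25)), ID (VARCHAR(11))
INSERT INTO DB.TABLE (NAME, ID) VALUES (:NAME, :ID);
.LOGOFF;
.QUIT
```

Teradata converts the VARCHAR ID to INTEGER implicitly on insert.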

Hi, can anybody tell me what we need to do when we are exporting 1) integer, 2) character, and 3) date fields to an IBM environment (as client)? I am getting junk characters. Can anyone tell me what kind of rules we need to follow when the client is IBM and Windows, while FastExporting the file?

Hi, we have defined a workload as having classifications of < 1 Sec CPU Time and 'Include Single or Few AMP Queries Only'. Does anybody know if there is a definition of 'Few AMPs' that is used?

Hi, I have a file with a header, data records, and a trailer. The data records are comma-delimited, and the header and trailer are fixed width. The header and trailer start with HDR and TRA respectively. I need to skip the header and trailer while loading the file with MultiLoad. Please help me in this case. Thank you. Regards, Kiran
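One common workaround is to strip the header and trailer records with a small shell filter before handing the file to MultiLoad. A sketch under the stated assumptions (file names are hypothetical; the mload call is commented out so the sketch runs anywhere):

```shell
#!/bin/sh
# Demo input: fixed-width HDR/TRA records around comma-delimited data.
cat > input.dat <<'EOF'
HDR20090101FIXEDWIDTH
a,b,c
d,e,f
TRA0000002
EOF

# Delete the header and trailer records; keep only data records.
sed '/^HDR/d; /^TRA/d' input.dat > data_only.dat
cat data_only.dat
# mload < load_data_only.ml   # then MultiLoad the filtered file
```

MultiLoad's APPLY ... WHERE clause can also filter records conditionally, but pre-filtering keeps the load script simple.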

Hi All, we have a requirement wherein a customer who has made purchases of at least $10000 in a 6-month period and made at least a single visit a month in a 6-consecutive-month period should be given a rating of A+B+, and a customer who has made purchases of at least $10000 in a 6-month period but has not made at least a single visit a month in a 6-consecutive-month period should be given a rating of A+B-.

Sample data:

CUSTOMER_ID  PURCHASE_DATE  PURCHASE_AMOUNT
101          02-MAR-07      10000
101          02-APR-07       8000
101          05-MAY-07      10000
105          18-JAN-08      10000
101          18-FEB-08       5000
102          18-MAR-08      20000
103          18-APR-08       3000
102          23-MAY-08       2000
104          02-JUN-08      10000
103          02-JUN-08      30000
101          02-JAN-08      51000

With this kind of data we need to write a query to retrieve only those customers who have data in each of the last six months; in this example it should be customer number '101'. Customer '103' should be ignored as it does not have data for the last six months CONSECUTIVELY. Thanks & Regards, Praveen
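One way to express "a visit in each of the last six months" is to count the distinct months a customer appears in over that window. A sketch (untested; the table name `purchases` is hypothetical, and the window is anchored to CURRENT_DATE, so within a 6-month span the month numbers are necessarily distinct):

```sql
-- Customers with at least one purchase in each of the last 6 months
SELECT customer_id
FROM purchases
WHERE purchase_date >= ADD_MONTHS(CURRENT_DATE, -6)
GROUP BY customer_id
HAVING COUNT(DISTINCT EXTRACT(MONTH FROM purchase_date)) = 6;
```

The $10000 threshold can be layered on with a second HAVING condition on SUM(purchase_amount).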

Hi All, I need to estimate the MultiLoad, FastLoad, and BTEQ scripts in a project. Can anybody tell me which factors to consider in the complexity of each MultiLoad, FastLoad, and BTEQ job, and the time for each activity? Thanks in advance.

I am calling the following:

bteq
.run file=/usr/wh/dev/login/login_td_procs.txt
(logging to /usr/wh/dev/log/truncate_y00442p_ot_bcp_as400_extract_status.log)

where /usr/wh/dev/login/login_td_procs.txt contains:

.logon mpp/dev_data_v, ;
database dev_data_v;

and /usr/wh/dev/sql/truncate_y00442p_ot_bcp_as400_extract_status.sql contains:

delete from ot_bcp_as400_extract_status;
drop table ot_bcp_as400_extract_status_e1;
drop table ot_bcp_as400_extract_status_e2;

The following is the resulting log file on SuSE SLES10:

+---------+---------+---------+---------+---------+---------+---------+----
 .run file=/usr/wh/dev/login/login_td_procs.txt
+---------+---------+---------+---------+---------+---------+---------+----
del.logon mpp/dev_data_v, ;
+---------+---------+---------+---------+---------+---------+---------+----
 database dev_data_v;
+---------+---------+---------+---------+---------+---------+---------+----
+---------+---------+---------+---------+---------+---------+---------+----
ete from ot_bcp_as400_extract_status;
+---------+---------+---------+---------+---------+---------+---------+----
 drop table ot_bcp_as400_extract_status_e1;
+---------+---------+---------+---------+---------+---------+---------+----
 drop table ot_bcp_as400_extract_status_e2;
+---------+---------+---------+---------+---------+---------+---------+----
 *** BTEQ exiting due to EOF on stdin.
 *** Exiting BTEQ...
 *** RC (return code) = 2

However, when run from RHEL v3 AS:

+---------+---------+---------+---------+---------+---------+---------+----
 .run file=/usr/wh/dev/login/login_td_procs.txt
+---------+---------+---------+---------+---------+---------+---------+----
 .logon mpp/dev_data_v,
 *** Logon successfully completed.
 *** Transaction Semantics are BTET.
 *** Character Set Name is 'ASCII'.
 *** Total elapsed time was 1 second.
+---------+---------+---------+---------+---------+---------+---------+----
 database dev_data_v;
 *** New default database accepted.
 *** Total elapsed time was 1 second.
+---------+---------+---------+---------+---------+---------+---------+----
+---------+---------+---------+---------+---------+---------+---------+----
 delete from ot_bcp_as400_extract_status;
 *** Delete completed. One row removed.
 *** Total elapsed time was 1 second.
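The "del" / "ete from" split in the SLES10 log is the classic symptom of the login file lacking a trailing newline, so the first characters of the next input run together with its last line. A hedged shell check/fix (the path is the one from the post; `tail -c 1` prints a non-empty string only when the last byte is not a newline):

```shell
#!/bin/sh
# Append a newline to a file only if its last byte is not already one.
fix_trailing_newline() {
    f="$1"
    if [ -n "$(tail -c 1 "$f")" ]; then
        printf '\n' >> "$f"
    fi
}

# Demo: a login file saved without a final newline.
printf '.logon mpp/dev_data_v, ;\ndatabase dev_data_v;' > login_td_procs.txt
fix_trailing_newline login_td_procs.txt
```

Running the function a second time leaves an already-terminated file unchanged.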

Are we able to write custom functions in FastExport?

What is the key factor that makes TPT better than other load utilities, given that it uses an SQL-like language syntax but invokes the various load utility protocols? How is it different from, and similar to, the other load utilities? Does it really impact performance, and how?

Hi! I am writing to see if you are currently available for a position in Ashburn, VA or Atlanta, GA. I am looking for a Teradata Production DBA for a 12+ month contract. If you or anyone you know is interested, please email me at your earliest convenience.

Location: Ashburn, VA and Atlanta, GA
Duration: 12+ months

Job Description: Sr. Teradata Production DBA with Teradata data warehouse experience, who will be responsible for design, installation, configuration, backup, recovery, performance tuning, replication, and active data warehousing. Should be strong in database design, tuning, and troubleshooting; able to write complex stored procedures; and understand SOX requirements pertaining to data warehousing. Familiarity with SQL. Experience with NetBackup and ARC utilities, BTEQ, and Unix scripting languages. Ability to create and understand data models.

Skills: AIX, Database, Data Warehouse, DBA, Engineering, Linux, Management, Performance, Solaris, SQL, Systems, Teradata, Unix

Sincerely,
Raad
Global Resource Management, Inc. (GRMI)
678 935 0437 (Direct)
770 729 9222 (Fax)
raad@grmi.net
www.grmi.net
"GRMI - Not Just A Name, A Commitment to Excellence"

Hi, I am running FastExport with various combinations and getting different results which I am unable to understand.

1)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (coalesce(Wrls,'') (char(10))) || (coalesce(Wrls,'') (char(15))) FROM BSAMPLE 1;
OUTPUT: SA-ABCDGroup Mechanics

2)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (coalesce(Wrls,'') (char(10))) || ',' || (coalesce(Wrls,'') (char(15))) FROM BSAMPLE 1;
OUTPUT: --SA-ABCD,Group Mechanics (junk characters at the beginning)

3)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (COALESCE(TRIM(Wrls), ' ')) || ',' || (coalesce(TRIM(Wrls), '')) FROM BSAMPLE 1;
OUTPUT: SA-ABCD,Group Mechanics

Basically it seems like (COALESCE(TRIM(Wrls_Actvty_Cd), ' ')) = (coalesce(Wrls_Actvty_Cd,'') (char(10))), but how? I am not able to understand. Also, please explain: if I have five columns in my table, 1) integer, 2) date, 3) character, 4) varchar, 5) numeric, what data conversions do we need to do at the time of exporting the file?

Hi, I have two FastExport scripts to write. The first one queries a table and returns multiple records saved in an output file. The second one summarizes the data; one of the fields in the second FastExport script is the summary (count) of the number of records obtained from the first export. To do this, I am thinking of counting the number of lines in the output file of FastExport #1 (wc -l) and assigning it to a variable (this is in ksh). I then want to get the value of that variable at runtime of FastExport #2. Is there a way to get the value of that variable? Or is there an alternative way for me to count the number of records (1 record = 1 line) from the output file of FastExport #1 and use that in FastExport #2?
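A hedged ksh-compatible sketch of the approach described (file names and the placeholder token are hypothetical): capture the wc -l count in a shell variable, then splice it into a template of the second FastExport script with sed before running fexp:

```shell
#!/bin/sh
# Demo stand-ins for the output of FastExport #1 and the template
# for FastExport #2 (real names would differ).
printf 'rec1\nrec2\nrec3\n' > export1_output.txt
printf 'TOTAL_RECORDS=REC_COUNT_TOKEN\n' > fexp2_template.fx

# 1 record = 1 line: capture the count in a shell variable.
REC_COUNT=$(wc -l < export1_output.txt)
REC_COUNT=$((REC_COUNT))     # normalize any padding wc adds

# Splice the count into the second script before running fexp.
sed "s/REC_COUNT_TOKEN/${REC_COUNT}/g" fexp2_template.fx > fexp2.fx
cat fexp2.fx                 # -> TOTAL_RECORDS=3
# fexp < fexp2.fx            # then run FastExport #2
```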

Hi all, I am a beginner to Teradata, and I wonder if this question has been asked before: is there any best-known format to be used for TPump and MultiLoad that will give some edge to the performance of both these TTUs? Thanks

Hi All, I know we can get a MultiLoad script auto-generated using FastExport. Are there any similar shortcuts to generate a TPump or FastLoad script instead of coding the entire thing?

Hi, my requirement is: I am getting some files every day on a Unix server and I need to load them into Teradata tables using MultiLoad. I need to develop a shell script which will read the files from a particular directory and pass them to MultiLoad. Please can anyone help with how to pass a shell variable to a MultiLoad script?
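A common pattern is to loop over the files and generate the MultiLoad script through a here-document, so shell variables expand inside the script text. A sketch under stated assumptions (the directory, layout name, and apply label are hypothetical; the mload call is commented out so the sketch runs anywhere):

```shell
#!/bin/sh
# Demo: create a couple of stand-in data files.
mkdir -p /tmp/incoming
: > /tmp/incoming/sales1.dat
: > /tmp/incoming/sales2.dat

# For each file, expand the shell variable $DATAFILE into the
# MultiLoad script text via a here-document, then run mload on it.
for DATAFILE in /tmp/incoming/*.dat; do
    cat > /tmp/load_one.ml <<EOF
.IMPORT INFILE ${DATAFILE} LAYOUT FILELAYOUT APPLY INSERTS;
EOF
    # mload < /tmp/load_one.ml   # run MultiLoad for this file
done
```

Because the here-document delimiter (EOF) is unquoted, the shell substitutes ${DATAFILE} before MultiLoad ever sees the script.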

Hi All, please can anyone let me know the points to remember while FastExporting data from a mainframe system and loading the data back onto a non-mainframe system using FastLoad/MultiLoad? I have FastExported a file using the FASTLOAD format and MLSCRIPT option from the mainframe. Now I am trying to load the same file onto a Linux system but am getting format incompatibility errors.


Hi all, in a BTEQ script I need to pick a value from a query, assign it to a variable, and use that variable in an if/else condition. Can anybody suggest how to assign a value to a variable and use an IF condition there? Thanks in advance. Kapil
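BTEQ has no user-defined variables, but its built-in ACTIVITYCOUNT and ERRORCODE registers together with .IF/.GOTO cover many branching cases. A minimal sketch (untested; database, table, and label names are hypothetical, and note that .GOTO only branches forward):

```
.logon tdpid/user,password
SELECT 1 FROM mydb.mytable WHERE status = 'READY';
.IF ACTIVITYCOUNT > 0 THEN .GOTO process
.QUIT 0
.LABEL process
/* statements to run only when the query returned rows */
.QUIT 0
```

For a true assigned value, the usual workaround is to .EXPORT the value to a file and have the calling shell script splice it back into the next BTEQ run.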

Hi, I am trying to run MultiLoad by exporting a date to a file and then using it. Following is the code I am running:

.EXPORT REPORT DDNAME DELDATE
sel end_date (title '') from tableA;
.EXPORT RESET

While using it in MultiLoad I am giving:

.ACCEPT DELDTE FROM FILE DELDATE;
DELETE FROM TABLE B WHERE CURRENT_DATE < (&DELDTE - 10);

but it is replacing it as

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-10);

and coming out with an error, whereas it should be

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-01-23 - 10);

Please clarify how to export a date correctly and use it in MultiLoad with the ACCEPT option.
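The ACCEPT substitution is purely textual, so an unquoted 2009-01-23 is parsed as arithmetic (2009 minus 1 minus 23). One approach is to export the value with its own quoting so it lands as a DATE literal. A sketch of the export side (untested; four consecutive single quotes yield one literal quote in Teradata SQL):

```
.EXPORT REPORT DDNAME DELDATE
sel 'DATE ''' || (end_date (format 'yyyy-mm-dd') (char(10))) || ''''
    (title '')
from tableA;
.EXPORT RESET
```

The accepted value then expands to DATE '2009-01-23', and the delete can read: DELETE FROM TABLE B WHERE CURRENT_DATE < &DELDTE - 10;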

I need to export data from Teradata and import it into MySQL [Cluster]. I was wondering if anyone has an example of a FastExport script which creates MySQL-import-friendly files. Thanks. Vadim

Hi, I have to export repeatedly some 30M lines in a "text" format, with variable length if possible. FastExport (V2R6) adds a two-byte indicator (as specified in the FastExport documentation) at the beginning of the records, whatever the mode, INDICATOR or RECORD. Does anybody know a way not to keep those two bytes? I tried to cast the whole chain (a concatenation of the columns).

Hi All, I need to create a copy of a database in a newly created database. My old database has around 200 elements comprising macros, stored procedures, and tables/views. Is there any Teradata utility available to do this? It will take a long time to copy each individual element. It's an urgent requirement. Thanks in advance.

When I run a query using TdCommand.ExecuteReader I am getting back an Integer when the type should be TdTime. The data type of the field in question is TIME within the Teradata DB. I have also set "Enable TdDateTime=true;" in the connection string. Please advise.

Hi, I have installed the Teradata demo and I want to connect to the server, but I get the error: "ODBC 28000: not enough information for logon". Please help me resolve this error. Neeraj Shukla

Hello, I FastExported as follows:

.logtable retail.restartlog1_fxp;
.logon localtd/dbc,dbc;
.begin export sessions 4;
.export outfile C:\Users\test\Documents\retail14.txt;
sel * from retail.area_3;
.end export;
.logoff;

and the output it gave me is p

Hi! I tried installing Teradata Express Edition 12.0 on a Windows Server 2008 R2 server, but it did not work. The installer did not recognize the operating system and claims: "Your Microsoft Windows version is not listed above."

We need to use BTEQ on one of our Unix boxes. The DNS configuration of the box doesn't use the /etc/hosts file, just a name server, and I have a problem connecting to TD. From the box I can ping the TD server, ssh to it, telnet to the TD port, etc., all just fine. But if I try to connect using BTEQ I get:

*** CLI error: MTDP: EM_NOHOST(224): name not in HOSTS file or names database.
*** Return code from CLI is: 224
*** Error: Logon failed!

What are we missing? What is the "names database" and how can I configure it? Thanks, Vadim

I had a bright idea and wrote a Unix shell script that requires a Teradata logon and password as parameters. I thought I could then substitute that logon and password into my FastLoad script without saving them into a file anywhere. However my attempt with sed failed, as FastLoad thinks the sed statement is the file. Has anyone managed to overcome this? This is what I did. First I tried the statement below on its own, which works, outputting the FastLoad script with the logon and password in the right place (logonid was the dummy word I had put after .logon and the IP address, before the ;):

sed 's/logonid/logon,password/g' < FLOAD_DIM_BRANCH.txt

Then I tried it with the fastload command:

cagcrd:BADEV:/badata/wtdata/scripts > fastload < sed 's/logonid/logon,password/g' < FLOAD_DIM_BRANCH.txt
A file or directory in the path name does not exist.
ksh: sed: 0403-016 Cannot find or open the file.

Reading up on Unix it looks like redirection does not let you do this, but I have hopes that someone out there knows a way around it. Regards, Susan
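The `<` operator expects a file name, not a command, so the sed output has to reach fastload through a pipe instead. A hedged sketch (the script name and placeholder are from the post; the IP address and credentials are made up, and the fastload call is commented out so the sketch runs anywhere):

```shell
#!/bin/sh
# Stand-in for FLOAD_DIM_BRANCH.txt with the dummy word after .logon:
printf '.logon 10.0.0.1/logonid;\n' > FLOAD_DIM_BRANCH.txt

LOGON="myuser,mypassword"   # would come from the script's parameters

# Pipe the substituted script straight into fastload;
# the credentials never touch disk.
sed "s/logonid/${LOGON}/g" FLOAD_DIM_BRANCH.txt  # | fastload
```

Note the double quotes around the sed expression so ${LOGON} expands; the single-quoted version from the post would pass the literal text through unchanged.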

I need to know: I can log in, but why do my sessions time out? Do I have to fix the source code or not? Or is this an error from my computer? Anybody can help me. BR, mnstr.

Hi, I'm trying to load a UTF-16 format file (a tab-delimited file exported by Cognos) into a Teradata table using FastLoad on Windows. The script is coded to read from the 2nd record onwards. FastLoad is failing with the below error:

**** 17:25:54 Not enough fields in vartext data record number: 2
**** 17:25:54 File record count
**** 17:25:54 Value for Record statement may be too large

We tried changing the session charset to UTF16 but it did not work. However, I was able to load the file after copying the data into a new text file and using that in FastLoad. Could you please suggest how I can get the original file working? Suggestions are welcome.

Hi All, can anyone tell me how to retrieve column values using DBC.Columns in a FastExport script and import those values into a table using a MultiLoad script? I was able to retrieve only column names using DBC.Columns.

Hi All, I have a script which contains SELECT, DELETE, UPDATE, and INSERT statements. To export the result of a SELECT statement in BTEQ I can give:

.export report file=filename
sel * from table;
.export reset

But how can I export the result (number of records) of a DELETE/INSERT/UPDATE statement in BTEQ? For example, for

delete from table where column1=1;

I want the number of rows affected.
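BTEQ reports the affected-row count on stdout ("*** Delete completed. N rows removed."), not through .EXPORT, so one workaround is to capture BTEQ's output and scrape the count with the shell. A sketch (the log text is a stand-in; BTEQ prints "One row removed." for a single row, which this pattern would not match):

```shell
#!/bin/sh
# Stand-in for a captured BTEQ log (real version: bteq < script > bteq.log).
cat > bteq.log <<'EOF'
 *** Delete completed. 42 rows removed.
 *** Total elapsed time was 1 second.
EOF

# Pull the numeric row count out of the completion message.
ROWS=$(sed -n 's/.*Delete completed\. \([0-9][0-9]*\) rows removed.*/\1/p' bteq.log)
echo "$ROWS"   # -> 42
```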

Hi, I want to connect to the server, but this error occurs: "160: specified driver could not load because of system error 126". The one solution I found was to reinstall Teradata. I did, but now I am getting an error during uninstallation: "unable to locate the installation log file uninst.isu". Please help me. Thanks & regards, Neeraj Shukla

Hi, I'd like to upload a file which appears to be poorly formatted. The file is supposed to be seven columns wide, but turns out to be both wider and narrower than that. When it's wider, FastLoad simply ignores the additional columns.

1. I've tried FastExport.

1.1. If I use a varchar field:

select .. || ',' || ... from ...

I always have n bytes at the beginning of each row telling how long the data for the row will be (I've tried format = text/unformat and mode = record/indicator). I cannot have anything before a row.

1.2. If I use a char field:

select .. || ',' || ... (char(100)) from ...

with format = text, mode = record, it seems to be OK, but each record takes 100 bytes (in the future it will be n kilobytes long) and my file will become very large for a large number of records.

2. I've used the bteq command.

2.1. But I've read that it is not so optimal for a large number of records.

The best idea would be something like 1.2 but without the empty padding at the end (I can add something like CRLF at the end of my concatenated string). Thanks for any answers, oen

I configured IIS 7 as shown in the attached picture IIS.jpg, and when I install SQL Assistant Web 12.0 I get the error shown in the attached picture errorSQLWeb.jpg. How do I configure IIS 7 on Windows Vista before installing SQL Assistant Web 12.0?

I would like to suppress the column heading and underscore output from the following BTEQ script. Please could you let me know how to do this; the manual seems to omit this option. I can get round it using the tail command in Unix, but it would look more professional if it could be done in BTEQ.

.run file logon.txt
.export report file=/badata/wtdata/tdata_count.txt
.set format off
select count(*) from TABLEA;
.EXPORT RESET
.LOGOFF
.EXIT

   Count(*)
-----------
       1186

By the way, I notice my file is appended to every time I run the BTEQ script. Is there a command to overwrite the content on every run? Again, I can work round this by deleting the file in the Unix script at the start of the run, but it would look better to do it in one hit within the BTEQ script. Thanks for your help, Susan
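Two small changes may cover both points (a sketch, untested): an empty TITLE phrase suppresses the column heading, .SET TITLEDASHES OFF suppresses the underscores, and BTEQ's .OS command can delete the old file from inside the script before exporting:

```
.run file logon.txt
.os rm /badata/wtdata/tdata_count.txt
.export report file=/badata/wtdata/tdata_count.txt
.set format off
.set titledashes off
select count(*) (title '') from TABLEA;
.EXPORT RESET
.LOGOFF
.EXIT
```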

Hi, I want to write a .vbs script to let BTEQ execute SQL files, but I don't know how to get the return code after BTEQ has executed the SQL.

I am running FastLoad scripts and BTEQ scripts from a Unix shell script. Everything was going fine until I inserted a test to make sure there was a return code of 0. My FastLoad script, in the FastLoad output, showed a return code of 0, but Unix is picking up 1. My BTEQ script is, deliberately, falling over with a return code of 8 but is returning 0 to Unix. How can I make BTEQ and FastLoad pass their return codes to Unix? This is the part of my shell script that is picking up a 1 from a completely OK FastLoad script:

fastload < $vScriptName
if (($? != 0))
    echo 'error ' $? ' in fastload script'
then
    vErrorText="script $vScriptNameSuffix failed"
    ShowError
fi

Here is the end of the output that shows 0 is the return code, but 1 is coming back to Unix:

. Highest return code encountered = '0'.
**** 16:42:22 FDL4818 FastLoad Terminated
error 1 in fastload script
Error: script FLOAD_DIM_BRANCH.txt failed

Many thanks for any help, Susan
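One likely culprit in the snippet above: the echo sits between `if ((...))` and `then`, and $? is re-read after other commands have run, so the test examines the wrong status. Capturing the exit code immediately avoids both problems. A sketch (a stub function returning 1 stands in for the fastload call so this runs anywhere):

```shell
#!/bin/sh
# $? must be read immediately after the command; every later command
# (echo included) overwrites it.
fastload_stub() { return 1; }    # stand-in for: fastload < "$vScriptName"

if fastload_stub; then
    rc=0
else
    rc=$?                        # capture the failing status right away
fi

if [ "$rc" -ne 0 ]; then
    echo "error $rc in fastload script"
fi
echo "captured rc=$rc"           # -> captured rc=1
```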

Is there a way (other than .RUN FILE) to use loops to perform an iteration in BTEQ and quit based on a condition?
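BTEQ's .GOTO only branches forward, so it cannot form a loop on its own; a common workaround is to loop in the shell and let each BTEQ run's exit status (set via .QUIT after an .IF test) decide when to stop. A sketch (a counter file stands in for the real BTEQ condition so the loop runs anywhere):

```shell
#!/bin/sh
# Re-run a BTEQ step until it signals "done" via its exit status.
# With real BTEQ, step.btq would end in ".QUIT 0" or ".QUIT 1"
# depending on a .IF ACTIVITYCOUNT test.
echo 0 > /tmp/counter
while : ; do
    n=$(cat /tmp/counter)
    n=$((n + 1))
    echo "$n" > /tmp/counter
    # bteq < step.btq; rc=$?            # real version
    if [ "$n" -ge 3 ]; then             # stand-in for: [ "$rc" -ne 0 ]
        break
    fi
done
echo "iterations: $(cat /tmp/counter)"   # -> iterations: 3
```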

When I run my Perl script, the error always occurs. The message is as follows:

.LOGON TU_CBIC,
*** CLI error: MTDP: EM_ERRFAILPCLGSSRSP(242): error or failure gss/sso rsp.
*** Return code from CLI is: 242
*** Error: Logon failed!
*** Total elapsed time was 45 seconds.

When I run it again, the message changes to:

.LOGON TU_CBIC,
*** Error 3004 User identification is not authorized.
*** Error: Logon failed!
*** Total elapsed time was one minute and 6 seconds.

How can I fix it? Please help me!

Hi, I've set up Teradata Manager and the Query Scheduler on a single desktop acting as a client. I've almost got everything I need (for now) working except for the email notification portion of the Query Scheduler. I can schedule a query and it runs, and I can also store it on the local drive and all, but I cannot get the email notification to work. I suspect I missed a step or a requirement. Do I need some sort of SMTP server to point to or something? If so, I can't seem to find the screen that asks for this. The "client" is just a Win XP desktop connected to the network. Help, anyone?

Can anyone help me with this? I want the field names from the query results to show up, not "A", "B", "C", etc. I'm using: SQL Assistant v7.2. Server is 6.2.0250 V2R. Driver is

Hi Everyone, can we import/export data to/from the Teradata server using BTEQ commands in interactive mode? I know it can be done for sure in batch mode. If we can in interactive mode, is the BTEQ script the same as for batch mode? Please clarify. Thanks, Gopi

I am attempting to log in once to FastLoad and then run a number of FastLoad scripts to load different data to different tables. My test was just to merge two scripts together, dropping the .LOGOFF from the first script and the .LOGON statement from the second script. My first error message told me I should not have more than one VARTEXT in a FastLoad job, so I deleted the second VARTEXT statement. Then I had the message that the file (DDNAME) or INMOD was already specified. I resolved that by inserting the CLEAR command just before the DEFINE statement. Finally, the error message I could not resolve, an error objecting to the second table or error files:

0027 BEGIN LOADING DIM_DAY ErrorFiles DIM_DAY_ERR1, DIM_DAY_ERR2;
**** 10:30:06 RDBMS error 2635: Error tables are invalid OR DIM_DAY is in an invalid state for Load

There is nothing wrong with these tables; if I run the second script independently, with a login, it runs cleanly. Has anyone any suggestions, or is it really true that you cannot load multiple tables with one logon in FastLoad? I thought I would be saving time with one login, plus I am trying to work out a secure way of running FastLoad scripts via a Unix shell script. My idea on security is that when running automatically from the root directory the job will pick up the logon from a secure file using .RUN plus the file name, and when someone is running manually they have to supply a logon and password, which then gets put in a file, only readable by that user, used by the FastLoad scripts instead of the logon in the secure file. I thought the problems would be easier to crack with one login rather than having to pull it in multiple times to multiple scripts. If anyone has written anything like this, could you possibly let me have the solution? Advice would be gratefully received. My Unix knowledge is fairly basic and I am new to FastLoad, although rapidly improving with much use of the Teradata reference manuals. Regards, Susan

Hi, I am trying to load an Oracle table named TST1 and its data into Teradata V12.0. I am using OleLoad. The datatypes in PL/SQL for CONFIG_ID and MD_SOURCE_SYSTEM are NUMBER(15), and the rest of them are VARCHAR2. When I run the OleLoad tool it does the following:

.SET SESSION CHARSET "UTF8";
LOGON localtd/tduser,tduser ;
DATABASE samples ;
CREATE TABLE "TST1" (
  CONFIG_ID DECIMAL(15,0) NOT NULL,
  MD_SOURCE_SYSTEM DECIMAL(15,0) NOT NULL,
  CONFIG_VALUE VARCHAR(50) CHARACTER SET UNICODE,
  CONFIG_DESCRIPTION VARCHAR(100) CHARACTER SET UNICODE,
  UPDATABLE_FLAG VARCHAR(1) CHARACTER SET UNICODE
) ;
BEGIN LOADING "TST1" ERRORFILES TST1_errors1, TST1_errors2 INDICATORS ;
AXSMOD Oledb_Axsmod "noprompt";
DEFINE
  CONFIG_ID (DECIMAL(15,0)),
  MD_SOURCE_SYSTEM (DECIMAL(15,0)),
  CONFIG_VALUE (VARCHAR(200)),
  CONFIG_DESCRIPTION (VARCHAR(400)),
  UPDATABLE_FLAG (VARCHAR(4))
INSERT INTO "TST1" (
  CONFIG_ID, MD_SOURCE_SYSTEM, CONFIG_VALUE, CONFIG_DESCRIPTION, UPDATABLE_FLAG
) VALUES (
  :CONFIG_ID, :MD_SOURCE_SYSTEM, :CONFIG_VALUE, :CONFIG_DESCRIPTION, :UPDATABLE_FLAG
) ;
END LOADING ;
LOGOFF ;

and it gives me this error for the CONFIG_ID and MD_SOURCE_SYSTEM columns: "Failure copying the source. Precision specified in the data value doesn't match the precision expected for this type." It works fine for the Oracle datatype NUMBER without any precision; when I have NUMBER(value) I get this kind of error. I cannot change the datatypes in Oracle, so can someone please suggest how to load the data into a Teradata table?

Hi all, can anyone tell me how to format data types in the output field section of a FastExport script? Thanks and Regards, Ramya

For example, how can I convert the text "032", which represents an ordinal (day-of-year) date, to "02/01" for February 1st? Thanks,
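On the shell side, GNU date can do the conversion by offsetting from January 1st; a sketch (the year is an assumption, since "032" alone does not carry one, and note 2009 is not a leap year so the offset is unaffected):

```shell
#!/bin/sh
# Convert ordinal date "032" (day-of-year) to MM/DD.
DOY=032
YEAR=2009                          # assumed: "032" has no year of its own
N=$(echo "$DOY" | sed 's/^0*//')   # strip leading zeros to avoid octal math
RESULT=$(date -d "${YEAR}-01-01 +$((N - 1)) days" '+%m/%d')
echo "$RESULT"                     # -> 02/01
```

In Teradata SQL, something along the lines of CAST('2009032' AS DATE FORMAT 'YYYYDDD') with an output FORMAT 'MM/DD' may achieve the same in-database (untested).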