1550 - 1600 of 2339 tags for Tools

Hi, I am trying to run MultiLoad by exporting a date to a file and then using it. This is the code I am running:

.EXPORT REPORT DDNAME DELDATE
sel end_date (title '') from tableA;
.EXPORT RESET

In the MultiLoad script I then use:

.ACCEPT DELDTE FROM FILE DELDATE;
DELETE FROM TABLE B WHERE CURRENT_DATE < (&DELDTE - 10);

But the substitution produces:

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-10);

and the job comes out with an error. It should be:

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-01-23 - 10);

Please clarify how to export a date correctly and use it in MLOAD with the ACCEPT option.
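.ACCEPT performs plain text substitution, so &DELDTE lands in the SQL unquoted and gets parsed as integer arithmetic (2009 - 01 - 23 - 10). One hedged, untested sketch is to export the value with the quotes already wrapped around it in BTEQ:

```
.EXPORT REPORT DDNAME DELDATE
SELECT '''' || CAST(CAST(end_date AS FORMAT 'YYYY-MM-DD') AS CHAR(10)) || '''' (TITLE '') FROM tableA;
.EXPORT RESET
```

so that &DELDTE expands to '2009-01-23' including the quotes, and the DELETE can then treat it as a date value, for example:

```
DELETE FROM TableB WHERE CURRENT_DATE < (CAST(&DELDTE AS DATE) - 10);
```

Exact CAST/FORMAT syntax may need adjusting for your release; the key point is that the quotes must travel with the value.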

I need to export data from Teradata and import it into MySQL [cluster]. I was wondering if anyone has an example of a FastExport script which creates MySQL-import-friendly files. Thanks, Vadim

Hi, I have to repeatedly export some 30 M lines in a "text" format, with variable length if possible. FastExport (V2R6) adds a two-byte indicator (as specified in the FastExport documentation) at the beginning of the records, whatever the mode, Indicator or Record. Does anybody know a way not to keep those two bytes? I tried to cast the whole chain, a concatenation o…

Hi All, I need to create a copy of a database into a newly created database. My old database has around 200 objects, comprising macros, stored procedures, and tables/views. Is there any Teradata utility available to do this? It would take a long time to copy each individual object. It's an urgent requirement. Thanks in advance.

When I run a query using TdCommand.ExecuteReader I am getting back an Integer when the type should be TdTime. The data type of the field in question is TIME in the Teradata DB. I have also set "Enable TdDateTime = true;" in the connection string. Please advise.

Hi, I have installed the Teradata demo. When I try to connect to the server, I get the error "ODBC 28000: not enough information for log on". Please help me resolve this error. Neeraj Shukla

Hello, I ran FastExport as follows:

.logtable retail.restartlog1_fxp;
.logon localtd/dbc,dbc;
.begin export sessions 4;
.export outfile C:\Users\test\Documents\retail14.txt;
sel * from retail.area_3;
.end export;
.logoff;

and the output it gave me is p…

Hi! I tried installing Teradata Express Edition 12.0 on a Windows Server 2008 R2 server, but it did not work. The installer did not recognize the operating system and claims: "Your Microsoft Windows version is not listed above."

We need to use BTEQ on one of our UNIX boxes. The DNS configuration of the box doesn't use the /etc/hosts file, just a name server, and I have a problem connecting to TD. From the box I can ping the TD server, ssh to it, telnet to the TD port, and so on, all just fine. But if I try to connect using BTEQ I get:

*** CLI error: MTDP: EM_NOHOST(224): name not in HOSTS file or names database.
*** Return code from CLI is: 224
*** Error: Logon failed!

What are we missing? What is the "names database" and how can I configure it? Thanks, Vadim
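A hedged note: CLI resolves the TDPID by appending 'cop1' (then cop2, and so on) to it and looking that name up, so the name the box must resolve is '<tdpid>cop1', not the bare server name; the "names database" is whatever resolver the box uses (DNS, NIS, or /etc/hosts). A sketch of a hosts-style entry, with an assumed IP address and TDPID:

```
# /etc/hosts entry (or the equivalent DNS A record)
10.0.0.5   tdprodcop1
```

after which a `.logon tdprod/...` should resolve, assuming 'tdprod' is the TDPID you log on with.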

I had a bright idea and wrote a Unix shell script that requires a Teradata logon and password as parameters. I thought I could then substitute that logon and password into my FastLoad script without saving it into a file anywhere. However, my attempt with sed failed, as FastLoad thinks the sed statement is the file. Has anyone managed to overcome this? This is what I did. First I tried the statement below on its own, which works, outputting the FastLoad script with the logon and password in the right place (logonid was the dummy word I had put after .logon and the IP address, before the ;):

sed 's/logonid/logon,password/g' <FLOAD_DIM_BRANCH.txt

Then I tried it with the fastload command:

cagcrd:BADEV:/badata/wtdata/scripts >fastload < sed 's/logonid/logon,password/g' <FLOAD_DIM_BRANCH.txt
A file or directory in the path name does not exist.
ksh: sed: 0403-016 Cannot find or open the file.

Reading up on UNIX, it looks like redirection does not let you do this, but I have hopes that someone out there knows a way around it. Regards, Susan
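The `<` redirection expects a file name, not a command, so the sed invocation is being read as a path. A pipe connects sed's output to fastload's stdin instead, and the resolved script never touches disk. A minimal sketch (the script content and credentials here are assumed examples, and the final line only prints the resolved script; in real use it would be piped into fastload as shown in the comment):

```shell
#!/bin/ksh
# Dummy fastload script with a "logonid" placeholder after the IP address.
cat > /tmp/FLOAD_DIM_BRANCH.txt <<'EOF'
.logon 10.0.0.1/logonid;
sel date;
.logoff;
EOF

LOGON="myuser"
PASSWORD="mypass"

# Real use:  sed "s/logonid/${LOGON},${PASSWORD}/g" /tmp/FLOAD_DIM_BRANCH.txt | fastload
# Here we just show the substituted script on stdout:
sed "s/logonid/${LOGON},${PASSWORD}/g" /tmp/FLOAD_DIM_BRANCH.txt
```

The first line of the output becomes `.logon 10.0.0.1/myuser,mypass;`, which is exactly what fastload would read from the pipe.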

I need to know: I can log in, but why do my sessions time out? Do I have to fix the source code or not, or is this an error from my computer? Can anybody help me? BR, mnstr.

Hi, I'm trying to load a UTF-16 format file (a tab-delimited file exported by Cognos) into a Teradata table using FastLoad on Windows. The script is coded to read from the 2nd record onwards. FastLoad fails with the error below:

**** 17:25:54 Not enough fields in vartext data record number: 2
**** 17:25:54 File record count
**** 17:25:54 Value for Record statement may be too large

We tried changing the session charset to UTF16, but it did not work. However, I was able to load the file after copying the data into a new text file and using that in FastLoad. Could you please suggest how I can get the original file working? Suggestions are welcome.

Hi All, can anyone tell me how to retrieve column values using DBC.Columns in a FastExport script and import those values into a table using a MultiLoad script? I was only able to retrieve column names using DBC.Columns.

Hi All, I have a script which contains SELECT, DELETE, UPDATE, and INSERT statements. To export the result of a SELECT statement in BTEQ I can use:

.export report file=filename
sel * from table;
.export reset

But how can I export the result (number of records) of a DELETE/INSERT/UPDATE statement in BTEQ? For example: delete from table where column1=1; I want the num…

Hi, I want to connect to the server, but the error "160: specified driver could not load because of system error 126" occurs. One solution I found was to reinstall Teradata. I did, but now I am getting an error during uninstallation: "unable to locate the installation log file uninst.isu". Please help me. Thanks & regards, Neeraj Shukla

Hi, I'd like to upload a file which appears to be poorly formatted. The file is supposed to be seven columns wide, but turns out to be both wider and narrower than that. When it's wider, FastLoad simply ignores the additional columns.

1. I've tried FastExport.
1.1. If I use a VARCHAR field (select .. || ',' || ... from ...) I always get n bytes at the beginning of each row telling how long the row's data will be (I've tried FORMAT = TEXT/UNFORMAT and MODE = RECORD/INDICATOR). I cannot have anything before a row.
1.2. If I use a CHAR field (select .. || ',' || (char(100)) ... from ...) with FORMAT = TEXT and MODE = RECORD, it seems to be OK, but each record is 100 bytes (in the future it will be n kilobytes long) and my file will become very large for a large number of records.
2. I've used the BTEQ command.
2.1. But I've read that it is not so optimal for a large number of records.
The best approach would be something like 1.2 but without the empty string at the end (I can add something like CRLF at the end of my concatenated string). Thanks for any answers, oen

I configured IIS7 as shown in the attached picture IIS.jpg, and when I install SQL Assistant Web 12.0 I get the error shown in the attached errorSQLWeb.jpg. How do I configure IIS7 on Windows Vista before installing SQL Assistant Web 12.0?

I would like to suppress the column heading and underscore output from the following BTEQ script. Could you please let me know how to do this? The manual seems to omit this option. I can get round it using the tail command in UNIX, but it would look more professional if it could be done in BTEQ.

.run file logon.txt
.export report file=/badata/wtdata/tdata_count.txt
.set format off
select count(*) from TABLEA;
.EXPORT RESET
.LOGOFF
.EXIT

   Count(*)
-----------
       1186

By the way, I notice my file is written to incrementally every time I run the BTEQ script. Is there a command to overwrite the content on every run? Again, I can work round this by deleting the file in the Unix script at the start of the run, but it would look better to do this in one hit within the BTEQ script. Thanks for your help, Susan
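An untested sketch of one common approach: give the column an empty TITLE so no heading text is printed, and switch the underline dashes off with .SET TITLEDASHES OFF:

```
.run file logon.txt
.set titledashes off
.export report file=/badata/wtdata/tdata_count.txt
select count(*) (title '') from TABLEA;
.export reset
.logoff
.exit
```

As for overwriting, I am not aware of a BTEQ .EXPORT option to truncate an existing file in older releases, so deleting the file from the shell before the run remains the usual workaround.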

Hi, I want to write a .vbs script to have BTEQ execute SQL files, but I don't know how to get the return code after BTEQ has executed the SQL.

I am running FastLoad scripts and BTEQ scripts from a UNIX shell script. Everything was going fine until I inserted a test to make sure there was a return code of 0. My FastLoad script, in the FastLoad output, showed a return code of 0, but Unix is picking up 1. My BTEQ script is, deliberately, failing with a return code of 8, but is returning 0 to Unix. How can I make BTEQ and FastLoad pass their return codes to UNIX? This is the part of my shell script that is picking up a 1 from a completely OK FastLoad script:

fastload < $vScriptName
if (($? != 0))
echo 'error ' $? ' in fastload script'
then
  vErrorText="script $vScriptNameSuffix failed"
  ShowError
fi

Here is the end of the output that shows 0 is the return code but 1 is coming back to UNIX:

Highest return code encountered = '0'.
**** 16:42:22 FDL4818 FastLoad Terminated
error 1 in fastload script
Error: script FLOAD_DIM_BRANCH.txt failed

Many thanks for any help, Susan
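`$?` is overwritten by every command that runs, so it must be captured on the very next line after the utility exits; in the script above the stray echo sitting between `if` and `then` also mangles the test. A minimal ksh-style sketch of the pattern, using a hypothetical stand-in function since fastload itself is not runnable here:

```shell
#!/bin/ksh
# Hypothetical stand-in for "fastload < $vScriptName"; it fails with code 1.
fastload_stub() { return 1; }

rc=0
fastload_stub || rc=$?     # capture the exit status immediately
if [ "$rc" -ne 0 ]; then
    echo "error $rc in fastload script"
fi
```

The same pattern applies to BTEQ: assign `rc=$?` on the line directly after the bteq invocation, before any echo or test runs.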

Is there a way (other than .run file) to use loops to perform an iteration in BTEQ and quit based on a condition?

When I run my perl script, the error always occurs. The message is as follows:

.LOGON TU_CBIC,
*** CLI error: MTDP: EM_ERRFAILPCLGSSRSP(242): error or failure gss/sso rsp.
*** Return code from CLI is: 242
*** Error: Logon failed!
*** Total elapsed time was 45 seconds.

When I run it again, the message changes to:

.LOGON TU_CBIC,
*** Error 3004 User identification is not authorized.
*** Error: Logon failed!
*** Total elapsed time was one minute and 6 seconds.

How can I fix it? Please help me!

Hi, I've set up Teradata Manager and the Query Scheduler on a single desktop acting as a client. I've almost got everything I need (for now) working, except for the email notification portion of the Query Scheduler. I can schedule a query and it runs, and I can also store the results on the local drive, but I cannot get the email notification to work. I suspect I missed a step or a requirement. Do I need some sort of SMTP server to point to or something? If so, I can't seem to find the screen that asks for this. The "client" is just a Win XP desktop connected to the network. Can anyone help?

Can anyone help me with this? I want the field names from the query results to show up, not "A", "B", "C", etc. I'm using SQL Assistant v7.2; the server is 6.2.0250 v2r…; the driver is…

Hi Everyone, can we import/export data to/from the Teradata server using BTEQ commands in interactive mode? I know it can be done in batch mode. If it can be done in interactive mode, is the BTEQ script the same as for batch mode? Please clarify. Thanks, Gopi

I am attempting to log in once to FastLoad and then run a number of FastLoad scripts to load different data to different tables. My test was just to merge two scripts together, dropping the logoff from the first script and the .logon statement from the second script. My first error message told me I should not have more than one vartext in a FastLoad job, so I deleted the second vartext statement. Then I had the message that the file (DDNAME) or INMOD was already specified. I resolved that by inserting the CLEAR command just before the define statement. Finally, the error message I could not resolve: an error objecting to the second table or error files.

0027 BEGIN LOADING DIM_DAY ErrorFiles DIM_DAY_ERR1, DIM_DAY_ERR2;
**** 10:30:06 RDBMS error 2635: Error tables are invalid OR DIM_DAY is in an invalid state for Load

There is nothing wrong with these tables; if I run the second script independently, with a login, it runs cleanly. Has anyone any suggestions, or is it really true that you cannot load multiple tables with one logon in FastLoad? I thought I would be saving time with one login, plus I am trying to work out a secure way of running FastLoad scripts via a UNIX shell script. My idea on security is that when running automatically from the root directory, the job will pick up the logon from a secure file using .RUN plus the file name, and when someone is running manually they have to supply a logon and password, which then gets put in a file, only readable by that user, used by the FastLoad scripts instead of the logon in the secure file. I thought the problems would be easier to crack with one login rather than having to pull it in multiple times to multiple scripts. If anyone has written anything like this, could you possibly let me have the solution? Advice would be gratefully received. My Unix knowledge is fairly basic and I am new to FastLoad, although rapidly improving with much use of the Teradata reference manuals. Regards, Susan

Hi, I am trying to load an Oracle table named TST1 and its data into Teradata V12.0. I am using OleLoad. The PL/SQL data types for CONFIG_ID and MD_SOURCE_SYSTEM are NUMBER(15), and the rest are VARCHAR2. When I run the OleLoad tool it does the following:

.SET SESSION CHARSET "UTF8";
LOGON localtd/tduser,tduser ;
DATABASE samples ;
CREATE TABLE "TST1" (
  CONFIG_ID DECIMAL(15,0) NOT NULL,
  MD_SOURCE_SYSTEM DECIMAL(15,0) NOT NULL,
  CONFIG_VALUE VARCHAR(50) CHARACTER SET UNICODE,
  CONFIG_DESCRIPTION VARCHAR(100) CHARACTER SET UNICODE,
  UPDATABLE_FLAG VARCHAR(1) CHARACTER SET UNICODE ) ;
BEGIN LOADING "TST1" ERRORFILES TST1_errors1, TST1_errors2 INDICATORS ;
AXSMOD Oledb_Axsmod "noprompt";
DEFINE
  CONFIG_ID (DECIMAL(15,0)),
  MD_SOURCE_SYSTEM (DECIMAL(15,0)),
  CONFIG_VALUE (VARCHAR(200)),
  CONFIG_DESCRIPTION (VARCHAR(400)),
  UPDATABLE_FLAG (VARCHAR(4))
INSERT INTO "TST1" ( CONFIG_ID, MD_SOURCE_SYSTEM, CONFIG_VALUE, CONFIG_DESCRIPTION, UPDATABLE_FLAG )
VALUES ( :CONFIG_ID, :MD_SOURCE_SYSTEM, :CONFIG_VALUE, :CONFIG_DESCRIPTION, :UPDATABLE_FLAG ) ;
END LOADING ;
LOGOFF ;

and gives me this error for the CONFIG_ID and MD_SOURCE_SYSTEM columns: "Failure copying the source. Precision specified in the data value doesn't match the precision expected for this type." It works fine for the Oracle data type NUMBER without any precision; when I have NUMBER(value) I get this kind of error. I cannot change the data types in Oracle. So, can someone please suggest how to load the data into a Teradata table?

Hi all, can anyone tell me how to format data types in the output field section of a FastExport script? Thanks and regards, Ramya

For example, how can I convert the text "032", which represents an ordinal (day-of-year) date, to "02/01" for February 1st? Thanks,
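A quick sketch of the conversion using GNU date on Linux; the year (2009, a non-leap year) is an assumed example, since the ordinal alone does not determine whether day 60 onward shifts by a leap day:

```shell
# Day 001 is January 1st, so add (ordinal - 1) days to Jan 1.
# 10# forces base-10 so the leading zero in "032" is not read as octal.
ordinal=032
date -d "2009-01-01 +$((10#$ordinal - 1)) days" +%m/%d
```

which prints `02/01`. In Teradata SQL, a comparable route is casting the 'YYYYDDD' string to a DATE and reformatting it, though the exact FORMAT phrase depends on your release.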

We are using the OLE DB Provider to upload data from SQL Server 2005 to Teradata using AMJ files, etc. While uploading, data containing some junk characters, which SQL Server 2005 had accepted, failed during the MLoad acquisition phase. Can anyone help me with this?

Hi, my requirement is to collect stats on a particular table once a week. For that I want to check when stats were last collected, so that I can confirm it from HELP STATS and use something like: .IF MAX(DATE) >= CURRENT_DATE - 6 ...

I use the script below: basically BTEQ, which then exports to a flat file. It gives the errors below, but generates files with control characters at the beginning. Can someone please help with removing the control characters and making the script below run?

.BEGIN EXPORT SESSIONS 1;
*** Error: Unrecognized command BEGIN.
.END EXPORT;
*** Error: Unrecognized command END.

Script:

#!/bin/ksh
bteq<<bteqend
.SESSIONS 1;
.logon,xxx123;
DATABASE SDR_ACCESS_VIEWS;
CREATE VOLATILE TABLE MY_SUPV AS
(SELECT DISTINCT
  COALESCE(cast(ATTRIB.SBC_USER_ID as varchar(20)),'') as Supv_Attuid,
  ATTRIB.SALES_RES_ID as Supv_srid,
  COALESCE(cast(ATTRIB.FIRST_NM as varchar(40)),'') as Supv_FirstName,
  COALESCE(cast(ATTRIB.MIDDLE_NM as varchar(40)),'') as Supv_MiddleName,
  COALESCE(cast(ATTRIB.LAST_NM as varchar(40)),'') as Supv_LastName
FROM SDR_ACCESS_VIEWS.VCCR0H8S_SALES_RES_ATTR ATTRIB,
     SDR_ACCESS_VIEWS.VCCR0H5S_ORG_NODE_SUPVSR SUPV
WHERE ATTRIB.SALES_RES_ID = SUPV.SALES_RES_ID
  AND ATTRIB.EFF_END_DT >= date
  AND ATTRIB.DATA_SNAPSHOT_DT = date
  AND ATTRIB.ENTITY_DB_STATUS_ON_SNAPSHOT = 'C'
  AND SUPV.EFF_END_DT >= date
  AND SUPV.DATA_SNAPSHOT_DT = date
  AND SUPV.ENTITY_DB_STATUS_ON_SNAPSHOT = 'C')
WITH DATA
UNIQUE PRIMARY INDEX (Supv_srid)
ON COMMIT PRESERVE ROWS;
.BEGIN EXPORT SESSIONS 1;
.EXPORT DATA FILE = hierarchy.txt;
SELECT COALESCE(cast(trim(S1) as varchar(20)),'')
  || COALESCE(cast(S2 as varchar(20)),'')
  || COALESCE(cast(S3 as varchar(40)),'')
  || COALESCE(cast(S4 as varchar(20)),'')
  || COALESCE(cast(S5 as varchar(40)),'')
from MY_SUPV ;
.EXPORT DATA FILE = my.dat;
bteqend
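A hedged sketch: .BEGIN EXPORT and .END EXPORT are FastExport commands, not BTEQ commands, which is why BTEQ reports them as unrecognized; and the leading control characters are most likely the record-length bytes that .EXPORT DATA writes in record mode. Exporting in REPORT mode produces plain printable text instead, roughly:

```
.SET WIDTH 254
.SET TITLEDASHES OFF
.EXPORT REPORT FILE = hierarchy.txt
SELECT ... (TITLE '') FROM MY_SUPV;
.EXPORT RESET
```

with the concatenated SELECT list assumed unchanged from the original script, and the column given an empty TITLE so no heading line is emitted. This is untested here and may need width tuning for the actual row length.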

My MultiLoad script hangs in the acquisition phase. It does not quit or do anything else and remains in that state for hours. If we kill the session and run MLoad again, it runs fine. Can anyone throw some light on why this would happen?

I have a column with values like '1111.'. I need to replace the '.' with ''. How can I do this?
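A quick sketch at the shell level, with the note that inside Teradata SQL the equivalent would be OREPLACE(col, '.', '') where the OREPLACE function is installed, or TRIM(TRAILING '.' FROM col) if the dot is always trailing (column name here is hypothetical):

```shell
# Delete every '.' character from the value.
echo '1111.' | tr -d '.'
```

which prints `1111`.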

I want to write a FastExport script where I will connect to server 'xxx.xx.xx' with a user ID and password. I know about the .LOGON command, where I can specify the user ID and password, but where can I specify the server name?
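In FastExport (and the other TTU utilities) the server goes into the .LOGON itself: the TDPID precedes the slash. A sketch with placeholder credentials:

```
.LOGON xxx.xx.xx/youruserid,yourpassword;
```

where 'xxx.xx.xx' must be a TDPID that the client's hosts file or DNS can resolve, typically via its 'cop1' alias.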

Hello, we have divided several of our backups into three tape groups on our BAR server, which back up to a SAN. Groups 2 and 3 are working properly; however, the 1st tape fails with the error below. Our backup scripts have not changed, and all three groups were running successfully a few days ago. Any help is appreciated.

12/26/2008 00:15:38 *** Failure ARC0805: Access Module returned error code 34: Error BAM1008: Duplicate storage objects found with NameSpace=Teradata, Path=TAPE1, Segment=F0000, Generation=812.

Thank you, rrenn001

I'd like to use BTEQ v12 to load an XML data file (filesize > 1MB) into a CLOB column. Is this possible? I haven't been able to find the proper syntax. I have done this using the (unsupported?) LOBTEQ utility.

Hi All, I need your urgent help with this. I have received a file which was FastExported from a mainframe platform in FastLoad format. Now I want to load it onto a Teradata box via a Linux platform. On Linux, I am using FastLoad format and loading with the MultiLoad utility. I am getting the error:

**** 13:25:05 UTY4015 Access module error '16' received during 'read' operation on record number '0': 'Unexpected data format'

Can you please advise what the possible cause might be? Regards, mtlrsk.

I am trying a PPI:

CREATE SET TABLE Employee
(Employee_Number INTEGER NOT NULL,
 Location_Number INTEGER,
 Dept_Number INTEGER,
 Emp_Mgr_Number INTEGER,
 Job_Code INTEGER,
 Last_Name CHAR(20),
 First_Name VARCHAR(20),
 Salary_Amount DECIMAL(10,2))
UNIQUE PRIMARY INDEX (Employee_Number)
PARTITION BY Location_Number;

It throws an error saying Location_Number should be part of the PI for its uniqueness. I have read that Location_Number should be part of the PI to make it UNIQUE, but I am not able to figure out why. Can anyone please explain why Location_Number needs to be included, although Employee_Number is unique? Please provide an example.
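A sketch of the two usual ways around this. The reason, as I understand it, is that rows are stored by partition, so uniqueness on the PI alone can only be enforced cheaply within a partition; a unique PI on a partitioned table must therefore include every partitioning column. Either fold the partitioning column into the UPI, or keep the original PI but make it non-unique:

```sql
-- Option 1: include the partitioning column in the unique PI.
UNIQUE PRIMARY INDEX (Employee_Number, Location_Number)
PARTITION BY Location_Number;

-- Option 2: keep the PI on Employee_Number alone, but non-unique
-- (uniqueness can then be enforced separately, e.g. with a USI).
PRIMARY INDEX (Employee_Number)
PARTITION BY Location_Number;
```

Both fragments are untested sketches against the table definition above.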

I was doing some quarterly analysis on our production system and noticed that DBCMANAGER is doing a lot of logons/logoffs (about 30 per minute). I was wondering if this is normal behavior or if there are some settings I am failing to configure.

2008-12-19 14:27:06.79 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:06.90 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:07.04 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:07.31 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:08.09 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:08.18 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:16.93 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:18.03 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:18.15 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:18.25 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:25.96 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:27.06 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:27.20 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:27.29 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:35.03 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:35.14 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:35.28 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:35.37 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:36.06 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:37.15 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:37.28 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:37.39 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:46.10 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:47.21 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:47.34 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:47.43 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:56.23 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:57.32 DBCMANAGER $H-DBC-MANAGER Logoff
2008-12-19 14:27:57.46 DBCMANAGER $H-DBC-MANAGER Logon
2008-12-19 14:27:57.57 DBCMANAGER $H-DBC-MANAGER Logoff

I have configured all of the data collection schedules (only one heartbeat, once per hour) and have a few TDManager alerts set up as well. The collection rates are Node 60 sec, VProc 60 sec, Session 60 sec. There are 2 sessions logged on all of the time under the ISSERVICE logon source (one for the MONITOR partition and one for the DBC/SQL partition). I am still pretty new at this (using TDMAN for about 6 months), so any information would be helpful and appreciated. Thanks,

Hi, by default FastLoad treats each line in the input file as one record. Is it possible to configure FastLoad so that we can define a special character, and FastLoad would read everything up to that special character as belonging to one record?

Example (sample flat file, special character '&&'):

&&abc|def|12|amb-09|asd|ed|ff|&&abc|def||||||&&nn|rt||||aa|aa|aa|&&

The above should translate into 3 records in the target table. Please let me know if this is possible with FastLoad. Regards, Annal T
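I am not aware of a FastLoad option for a custom record separator, so one hedged workaround is to preprocess the file so each '&&'-delimited record becomes one newline-delimited line, which FastLoad's normal VARTEXT handling can then read. A sketch using GNU awk (which supports a multi-character record separator; the file names are assumed examples):

```shell
# Build a sample file using '&&' as the record separator.
printf '&&abc|def|12|amb-09|asd|ed|ff|&&abc|def||||||&&nn|rt||||aa|aa|aa|&&' > /tmp/sample.dat

# Split on '&&' and print one line per non-empty record.
awk 'BEGIN { RS = "&&" } NF { print }' /tmp/sample.dat > /tmp/sample_lines.dat
cat /tmp/sample_lines.dat
```

This yields three newline-delimited records ready for a standard pipe-delimited VARTEXT FastLoad.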

Apologies if this has a simple answer, but I am trying to use SQL Assistant to run queries against an Oracle database. I have a user who can do this with an older version of SQL Assistant, and I have defined my connection the same way as my user's; while I can connect to the DB, it will not let me open a view of the table or run any quer…

I am new to Teradata and am currently working on accessing a Teradata server to fetch records from a particular table into a text file. I have installed the Teradata SQL client on my workstation (Windows XP) and am able to run SQL queries in the SQL client, as well as output to a text file.

SET WIDTH 80;
SET PAGELENGTH 1;
SET ECHOREQ OFF;
SET HEADING '';
SET SEPARATOR '';
SET SIDETITLES ON;
SET FORMAT OFF;
SET TITLEDASHES OFF;
IMPORT DDNAME=IN1;
REPEAT *
USING PDT (CHAR(19))
IMPORT DDNAME=IN2;
REPEAT 1
USING CMP (CHAR(03))
SELECT * FROM table1
WHERE FIN_TRANS_PST_TSTP > :PDT
AND SRC_CO = :CMP;
IF ACTIVITYCOUNT > 0 THEN .QUIT 2;
LOGOFF;
QUIT;

This is the code I use to import data for the two fields PDT and CMP, and I want to use them for comparison with the fields taken from the table. I have made sure that the format of the data remains the same (both from the table and from the file). But I get the following errors:

Failure 3706 Syntax error: expected something between ')' and '.'.
Statement# 2, Info =53
Input row number = 1
*** Total elapsed time was 0.00 seconds.
Starting Row 1 at 05:09:03 on Tue Dec 16, 2008
Failure 3857 Cannot use value (or macro parameter) to match 'PDT'.
Statement# 1, Info =0
Input row number = 2
*** Total elapsed time was 0.01 seconds.
Warning: Repeat is cancelled.

Please help me with this. Also, I am trying to get a return code of 2 if the query returns an answer row, so that I can use that code in the next step to abort the job.

We are facing a weird issue while invoking BTEQ from UNIX. We have a bunch of BTEQs in a shell script and they run one after another. It runs fine, and then suddenly, for a bunch of BTEQs, the log files come up empty.

Friends, we have recently got Windows Vista PCs. We have loaded the TD12 TTUs. We are able to establish a connection to Teradata servers, but it is not writing history. Please guide. Thanks, Hrishikesh

I have seen some similar mentions of using the OREPLACE function, but I need help getting OREPLACE to work on '1A' values, if possible. For example, with the following test data (output using the TRANSLATE_CHK function): 90210|testdata

Hi, I have this BTEQ (the logon and SQL info at the start are omitted), and the RUN_FILE does have some statements that will cause failures:

a)
.export reset
.run file=${RUN_FILE}
.if errorlevel <> 0 then .quit 3
.quit 0
EOF1
RC=$?

On running, it returns:

*** Warning: EOF on INPUT stream.
.if errorlevel <>0 then .quit 3
.quit 3
*** You are now logged off from the DBC.
*** Exiting BTEQ...
*** RC (return code) = 3

The warning message came because the RUN_FILE didn't end with a '.QUIT' or '.EXIT' statement. So I modified the code as below:

b)
select '.QUIT' (title '');
.export reset
.run file=${RUN_FILE}
.if errorlevel <> 0 then .quit 3
.quit 0
EOF1
RC=$?

On running, I get:

.QUIT
*** You are now logged off from the DBC.
*** Exiting BTEQ...
*** RC (return code) = 8

I don't get the EOF warning anymore, but I don't understand why the BTEQ return code changed to 8.

c) If you add .QUIT or .EXIT at the end of the run_file, does it only mean that there are no more instructions in the run_file, or does it cause BTEQ to quit the session entirely? Or does BTEQ continue with the next set of instructions after the .run file command?

Thanks, Srinivas Yelamanchili

We have just had Teradata Tools and Utilities loaded onto our test UNIX (AIX) environment. We can run FastLoad, MultiLoad, and FastExport scripts successfully using command-line entries such as

fastload < script.fastload

but we want to avoid having logons and passwords in the scripts when we set up jobs for batch running in production.
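One common approach, sketched here with assumed paths and credentials: keep the .logon line in a single file readable only by the batch user, and have every script pull it in with .RUN FILE instead of an inline .logon. The permission check at the end uses GNU stat, which may differ on stock AIX:

```shell
# Credentials live in one tightly-permissioned file (paths are examples).
umask 077
mkdir -p /tmp/td_secure
cat > /tmp/td_secure/logon.txt <<'EOF'
.logon tdprod/batchuser,batchpassword;
EOF
chmod 600 /tmp/td_secure/logon.txt

# Confirm only the owner can read it.
stat -c '%a' /tmp/td_secure/logon.txt
```

Each FastLoad/MultiLoad/BTEQ script then starts with `.RUN FILE=/tmp/td_secure/logon.txt;` in place of its .logon statement, so no password appears in the scripts themselves.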