
Can someone explain this, since I don't understand what is happening. My MLoad runs to a good completion code. However, my question is about the number of records read in compared to the number inserted. Records read = records that satisfy the conditions, yet the number of records inserted does not equal that number. Why? There are no UV or ET records written, so I am confused why the number inserted is not the same as the records that satisfied conditions. Here are the stats from the log file:

.                                   IMPORT 1   Total thus far
.                                  =========  ==============
Candidate records considered:        4662284         4662284
Apply conditions satisfied:          4662284         4662284
Candidate records not applied:             0               0
Candidate records rejected:                0               0
--------------------------------------------------------------
Inserts:   4624210
Updates:         0
Deletes:         0
0 TDWWorkTbls.ET_IMFStage
0 TDWWorkTbls.UV_IMFStage

Thanks!
Paul


Hello All, I have a query with 11 steps that creates several volatile tables in Teradata SQL Assistant. For some reason it stops after the fourth step and gives no error message. I can then highlight and run the additional steps without problem, but I need it to run straight through. Any ideas why this may happen?
Alan

Hi, I would like to know if there is any way to load CSV files with MLoad when they contain quotation marks. Excel places quotation marks around cells that have commas in the cell value. I know that in Oracle there is an optional clause in the control file to indicate that the data values may be enclosed in double quotes.

I have to update only one field in the table with one value using MultiLoad (I don't want to use BTEQ, since the DBAs dislike updating a table with >100k rows with BTEQ). For instance, the SQL is the following:

UPDATE DB.TABLE1
SET FIELD1 = 703
WHERE FIELD1 = 1;

Now, I have done updates using field input, but never with one constant column value.

We have a scheduled Teradata job that runs every day and works fine. Recently, this job failed with error code 3524, "the user does not have CREATE TABLE access to database XXX" (our database name). I know that is not correct, since it has run for a while and nothing has been changed.

We have installed Teradata on a Windows machine. Now on running Database Setup and providing the following information (in the Windows GUI):
Database Name : db1
Super User: DBC
Super User Password: dbc
Console Password: dbc
Set up DB for Teradata Manager: Yes
Perm Space: 100 MB
Spool Space: Same as Parent

Create a Teradata Manager User: Yes
User Account String: $H-DBC-MANAGER
User Password: dbcmngr
Perm Space: 100MB
Spool Space: Same as Parent
Give user privileges to run: Selected all options in combo
Migrate TDQM Database: Yes

Hi everyone, can anyone explain to me the use of the DEFAULT value and COMPRESS value in a table's DDL? If we do not use DEFAULT and COMPRESS values in the DDL, how does it affect performance? Thanks in advance!
Thanks, Guru

Hi, in UNIX I have FastExported 10 rows from a source table (2 columns: item_id, site_id) to dw_abc (text format). This is the syntax I used:

.export outfile dw_abc format text;

10 rows got exported, but the problem is that when I opened the output file "dw_abc" in UNIX, it showed this:

item_id site_id
100 200
300 400
500 600
700
800        <-- item_id, site_id were not separated
900 1000
101
201        <-- item_id, site_id were not separated
301 401
501
601        <-- item_id, site_id were not separated
701 801
901 1001

Some of the item_id, site_id pairs got separated by a tab delimiter; some were not separated. I want all item_id, site_id pairs to be separated by a tab delimiter. Can anyone explain?
Thanks for your support!
Thanks, Guru
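One common way to get a reliable delimiter with FORMAT TEXT is to build the delimited record in the SELECT itself: cast every column to character and concatenate an explicit tab ('09'XC) between them. A sketch under that approach (logon details and the source table name are placeholders):

```sql
.LOGTABLE utillog;
.LOGON tdpid/user,password;
.BEGIN EXPORT;
.EXPORT OUTFILE dw_abc FORMAT TEXT MODE RECORD;
SELECT TRIM(CAST(item_id AS VARCHAR(11))) || '09'XC ||
       TRIM(CAST(site_id AS VARCHAR(11)))
FROM src_table;
.END EXPORT;
.LOGOFF;
```

MODE RECORD also drops the indicator bytes that can otherwise show up in the file as stray characters.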

Hello all, having just started working with MLoad, I stumbled on this problem: when loading a file, the tool finds the correct number of lines, then proceeds to put all those lines in the [name of table]_UV table. When I then run a SELECT * on this table, I can see that every field of every line has the value '?'. I can understand that there is

Our shop is having difficulty with our NetVault backup failing due to what seems to be our alert policy through TASM killing "idle" sessions of the backup. Shortly after our upgrade to TD we implemented a policy through Teradata Alert Policy Editor to abort idle sessions after 1 hour. We then noticed that our weekly NetVault backup was failing due to a "DBMS Restart or Session Reset". We tried adding the 2 NetVault users to the Exception list in the Sessions tab of Alert Policy Editor, but the backup still failed with the same restart/reset error as before.

Last Friday afternoon we modified the policy to email instead of abort after 1 hour of idle time. The backup was successful, but we never got an email stating that the NetVault user had exceeded the 1-hour threshold. Is it possible that the policy is aborting some sort of internal session that is used by NetVault but is not recognized as a valid user to send an email about? We have another policy that emails our DBAs if a session is blocked, and it DOES email when an internal session blocks another user's session. This is what is so confusing to me. Is there something I am missing?

Thank you,
Frustrated

Is there a way to copy STATS collected on a table to a duplicate table? We have a process we refer to as CLONE TABLE processing. Basically, when we are required to modify a large table, rather than performing an ALTER TABLE we create a new table with the new attributes and perform a FASTLOAD or INSERT INTO. Making the changes this way outperforms ALTER TABLE for large tables; the problem is that collecting stats takes a long time. We are also noticing that table sizes increase dramatically if we ALTER TABLE vs. performing clone table processing. In a recent example we increased a table's size by 100GB by adding a single attribute using ALTER vs. CLONE... very strange.
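For what it's worth, some Teradata releases can copy statistics between identically structured tables without rescanning the data, which would fit this clone-table flow; if your release supports the FROM clause, it looks roughly like this (table names hypothetical):

```sql
-- Copy optimizer statistics from the original table to its clone
-- (the two table definitions must match).
COLLECT STATISTICS ON MyDB.Customer_New FROM MyDB.Customer_Old;
```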

Hello support team members, I need your help. In DBC.DBQLogTbl there is a field named AppID (the application ID under which the query is submitted). Does anyone know what UVSH stands for? For example, you also find BTEQ, SAS, JDBC, TPUMPEXE, TSCOM3.OUT; these are known application IDs.

Hi all,

The 'Expired' status can be found in the Status column of the TD QS Viewer application. When I only had a handful of jobs, I used to just drop and recreate a scheduled request to get rid of the 'Expired' status. (Yes, it may be primitive, but it worked for me then). But now, I have a whole system of jobs and they eventually expire after some time. It just doesn't make sense for me to reset them manually anymore. I want the jobs to stop expiring and just let them run continuously unless I manually stop them.

Hi, I need to move a user along with its objects from V2R5 to TD13. I was able to archive it, but while copying it to TD13 it errors out saying "unsupported rdbms". Isn't it possible to migrate the data from V2R5 to TD13? Please help.

I am trying to use the TDGeoImport utility to load some data. I cannot get it to run under either Vista or XP. When using Vista I get an EXCEPTION_ACCESS_VIOLATION (0xc0000005) error from the JVM. Most suggestions regarding these errors say either the JVM or the application has bugs, or there is a security error. I have tried different directory permissions, different JVMs, and disabling UAC, but still no luck.

Vista log:

C:\Users\williamh\Documents\install\teradata\TdGeoImportExport\bin>"C:\Program Files\Java\jre6\bin\java" -classpath .;"C:\Users\williamh\Documents\install\teradata\TdGeoImportExport\bin\terajdbc4.jar";"C:\Users\williamh\Documents\install\teradata\TdGeoImportExport\bin\tdgssconfig.jar";. com.teradata.geo.TDGeoImport -l,xxxxxx -s wsadmin -f C:\Users\williamh
Logon = ,xxxxxx
DatabaseName = wsadmin
Data Source = C:\Users\williamh
connecting to ... connected!
-------------------------------------------
importing layer Branch_mif to table BRANCH_MIF ...
#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x77d66739, pid=2628, tid=5644
#
# JRE version: 6.0_15-b03
# Java VM: Java HotSpot(TM) Client VM (14.1-b02 mixed mode, sharing windows-x86)
# Problematic frame:
# C [ntdll.dll+0x66739]
#
# An error report file with more information is saved as:
# C:\Users\williamh\Documents\install\teradata\TdGeoImportExport\bin\hs_err_pid2628.log
#
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp
#

Under XP I do not even get that far. I seem to have a PATH problem or be missing a DLL. Any ideas?

XP log:

C:\customer_data\xxxxxxxxxxx\TdGeoImportExport\bin>java -classpath .;terajdbc4.jar;tdgssconfig.jar;. com.teradata.geo.TDGeoImport -l,xxxxx -s wsadmin -f C:\Users\williamh
java.lang.UnsatisfiedLinkError: C:\customer_data\xxxxxxxx\TdGeoImportExport\bin\geojni.dll: This application has failed to start because the application configuration is incorrect. Reinstalling the application may fix this problem
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(Unknown Source)
        at java.lang.ClassLoader.loadLibrary(Unknown Source)
        at java.lang.Runtime.loadLibrary0(Unknown Source)
        at java.lang.System.loadLibrary(Unknown Source)
        at com.teradata.geo.TDGeoImport.main(TDGeoImport.java:869)
===========================================
Data importing failed!

Hi. What is the need for COLLECT STATISTICS in Teradata? Can anyone explain with suitable examples? Also, how do we choose the columns to collect statistics on?
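In short, statistics give the optimizer accurate row counts and value demographics, so it can choose good join plans instead of relying on rough estimates; the usual candidates are index columns and columns used in joins and WHERE clauses. The basic statements look like this (table and column names invented for illustration):

```sql
-- Collect on an index and on a frequently filtered column.
COLLECT STATISTICS ON Retail.SalesFact INDEX (sale_id);
COLLECT STATISTICS ON Retail.SalesFact COLUMN (sale_date);

-- Review what has been collected and when.
HELP STATISTICS Retail.SalesFact;
```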

I cannot install Teradata SQL Assistant on Windows 7. Can anybody give me more info?

I have a notes field VARCHAR(72) that contains some linefeed/tab characters. Is there an easy way to strip them out? I tried replace(col1,'OA'XC,' ') but it didn't like it.
Thanks
Anil
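Two likely reasons it "didn't like it": the hex literal uses a letter O instead of a zero, and plain REPLACE is not a built-in function in older Teradata releases (OREPLACE exists as an Oracle-compatibility function on some systems). A hedged sketch, assuming OREPLACE is available; table and column names are placeholders:

```sql
-- '0A'XC is linefeed, '0D'XC carriage return, '09'XC tab
-- (note the zero: 'OA'XC with a letter O is not a valid hex literal).
SELECT OREPLACE(OREPLACE(OREPLACE(notes, '0A'XC, ' '),
                         '0D'XC, ' '),
                '09'XC, ' ') AS cleaned_notes
FROM MyDB.MyTable;
```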

Hi all, I need an answer to this question urgently: can we provide a database name to the TDPID parameter in the connection manager? Please reply ASAP. Thanks in advance!

Hi, after installing the Tools and Utilities onto a WinXP SP2 desktop, BTEQ gives an error that CAPUTL.DLL was not found. I've reinstalled and rebooted several times, but the issue remains. Any suggestions?

Hi friends, I need some real-world examples of FastLoad scripts using an INMOD (UNIX/C). The FastLoad needs to handle special characters in the source data file before inserting into the table. Can anyone help me with this? Thanks a lot!
Patricia

Hi, please help. How can we validate flat-file data in Teradata? For example: I have 100 records in the source file loaded using FastLoad, but I got only 90 records in the target. Can you tell me how to do flat-file validation in Teradata?

Hi all! We have our Teradata database on a UNIX server, and recently we had a Teradata version upgrade from 6 to 7. I am working on a Teradata FastLoad script that worked correctly without any error but gives an error when executed on the new version. The FastLoad script is as follows:

.logon USER_ID/PASS_WD,
.LOGTABLE ERR_DB.TABLE_NAME ;
DROP TABLE ERR_DB.UV_table_name ;
DROP TABLE ERR_DB.ET_table_name ;
DELETE FROM DB.table_name ;
set record unformatted;
define
  emp_id      (CHAR(12) ),
  org_name    (CHAR(50) , NULLIF = '*'),
  prv_zip     (CHAR(5)  , NULLIF = '*'),
  last_nm     (CHAR(40) , NULLIF = '*'),
  frst_nm     (CHAR(20) , NULLIF = '*'),
  prof_pfx    (CHAR(4)  , NULLIF = '*'),
  address     (CHAR(50) , NULLIF = '*'),
  city_nm     (CHAR(50) , NULLIF = '*'),
  newlinechar (CHAR(1))
file=file_name;
show;
begin loading DB.table_name errorfiles ERR_DB.ET_table_name, ERR_DB.UV_table_name checkpoint 0 ;
insert into DB.table_name ( emp_id, org_name, prv_zip, last_nm, frst_nm, prof_pfx, address, city_nm )
VALUES ( :emp_id, :org_name, :prv_zip, :last_nm, :frst_nm, :prof_pfx, :address, :city_nm) ;
end loading;
logoff;

It encounters the error "syntax error at line 13 : `(' unexpected". I googled this issue to find the exact reason and found a few posts which say the error is due to the "OS version". Does anyone have something to say about this? Thanks in advance!
Sanjay
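For what it's worth, "`(' unexpected" is a Bourne/Korn shell message rather than a FastLoad one; it usually means the script file itself is being executed by the shell instead of being fed to the fastload binary, which a version upgrade or a changed job wrapper can easily introduce. A sketch of the two invocations (file names hypothetical):

```sh
# Wrong: the shell tries to interpret the FastLoad script itself
# and chokes on the first '(' in the define block.
./load_emp.fld

# Right: start the fastload client and feed it the script on stdin.
fastload < load_emp.fld > load_emp.log 2>&1
```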

Hi Teradata gurus! I am using the Teradata FastLoad utility to load data from a file into a table. The table is loaded successfully with all the data in the file, but FastLoad returns the following error: Highest return code encountered = '4'. Kindly help! For further information, please see the loader log details below:

**** 18:50:55 Processing starting at: Fri Sep 11 18:50:55 2009
===================================================================
=                       Logon/Connection                          =
===================================================================
0001 .logon USER_ID/PASSWD,
**** 18:50:55 Teradata Database Release: V2R.
**** 18:50:55 Teradata Database Version:
**** 18:50:55 Current CLI or RDBMS allows maximum row size: 64K
**** 18:50:55 Character set for this job: ASCII
**** 18:53:09 CLI Error 301: CLI2: SESSOVER(301): Exceeded max number of sessions allowed.
**** 18:53:12 Number of FastLoad sessions connected = 298
**** 18:53:12 FDL4808 LOGON successful
0002 DROP TABLE utlty_c.Table1 ;
**** 18:53:12 RDBMS error 3807: Object 'utlty_c.UV_Table' does not exist.
0003 DROP TABLE utlty_c.ET_Table ;
**** 18:53:12 RDBMS error 3807: Object 'utlty_c.ETTable' does not exist.
0004 DELETE FROM table_name ;
**** 18:53:15 Command completed successfully
0005 set record unformatted;
**** 18:53:15 Now set to read 'UNFORMATTED' records
**** 18:53:15 Command completed successfully
0006 define File Lay_out file=file_name;
**** 18:53:15 FDL4803 DEFINE statement processed
0007 show;
FILE = file_name
description of field_name1
description of field_name2...
TOTAL RECORD LENGTH = 345
0008 begin loading table_name errorfiles utlty_c.ET_Table, utility.UV_table checkpoint 0 ;
**** 18:53:33 Number of AMPs available: 384
**** 18:53:33 BEGIN LOADING COMPLETE
===================================================================
=                         Insert Phase                            =
===================================================================
0009 insert into table_name ( field_names ) values ( ) ;
**** 18:53:37 Number of recs/msg: 182
**** 18:53:37 Starting to send to RDBMS with record 1
**** 18:53:38 Starting row 100000
**** 18:53:38 Starting row 200000
**** 18:53:38 Starting row 300000
**** 18:53:39 Starting row 400000
**** 18:53:39 Starting row 500000
**** 18:53:40 Starting row 600000
**** 18:53:40 Starting row 700000
**** 18:53:40 Starting row 800000
**** 18:53:41 Starting row 900000
**** 18:53:41 Starting row 1000000
**** 18:53:42 Starting row 1100000
**** 18:53:42 Starting row 1200000
**** 18:53:43 Starting row 1300000
**** 18:53:43 Starting row 1400000
**** 18:53:43 Starting row 1500000
**** 18:53:44 Starting row 1600000
**** 18:53:44 Sending row 1668271
**** 18:53:48 Finished sending rows to the RDBMS
===================================================================
=                       End Loading Phase                         =
===================================================================
0010 end loading;
**** 18:53:51 END LOADING COMPLETE
Total Records Read    = 1668271
Total Error Table 1   = 0 ---- Table has been dropped
Total Error Table 2   = 0 ---- Table has been dropped
Total Inserts Applied = 1668271
Total Duplicate Rows  = 0
Start: Fri Sep 11 18:53:49 2009
End  : Fri Sep 11 18:53:51 2009
0011 logoff;
===================================================================
=                      Logoff/Disconnect                          =
===================================================================
**** 18:53:58 Logging off all sessions
**** 18:54:38 Total processor time used = '11.2335 Seconds'
.     Start : Fri Sep 11 18:50:55 2009
.     End   : Fri Sep 11 18:54:38 2009
.     Highest return code encountered = '4'.
**** 18:54:38 FDL4818 FastLoad Terminated

Hi friends, is there a FILLER concept in FastLoad? I.e., I want to load only specified columns with FastLoad. Can I do that?
Thanks
KIRAN KUMAR
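As far as I know, classic FastLoad has no explicit FILLER keyword (that belongs to MultiLoad/TPump layouts), but you can get a similar effect by defining every field in the record and simply not referencing the unwanted ones in the INSERT. A sketch under that assumption, with invented names:

```sql
set record unformatted;
define
  emp_id   (CHAR(12)),
  skip_fld (CHAR(20)),   /* present in the file, not loaded */
  dept_cd  (CHAR(4))
file = infile.dat;

insert into MyDB.Emp (emp_id, dept_cd)
values (:emp_id, :dept_cd);   /* :skip_fld is never referenced */
```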


I left this post on the main Teradata website forum, but I thought this might be more appropriate and perhaps get a response. Thanks for taking a look. I just found this little nook inside of Teradata and I'm glad it exists.

I've created a Windows named pipe in a vb.net application using the System.IO.Pipes namespace. I'm creating the pipe, and my process appears to be connected to it, blocked and waiting to read; now I would just like to feed the pipe from Teradata.

Hi, I am unloading data into files using FastExport, but the data is in binary format even though I am using text format. Please send me a sample script to export the data in text format.
Thanks
kiran

Hi friends, I am unable to load data when inserting nulls or strings into the table using Query Man:

insert into fact_current values(id, name, ,"expected", salary);

What is the problem? Please tell me the solution.
KIRAN
kiran_teradata@ymail.com

How do I verify what version of software is loaded to our client? Here is what I need to verify: piom, tdicu, cliv2, teragss.I don't have the installation CD, and I need to validate the version.Thanks!Paul

Hello: I've created a Windows named pipe. My process is connected to the pipe and is ready to read it. I now need FastExport to write to the pipe. Is this possible without using FastLoad or the Named Pipe Access Module? I'm looking to do something like this:

.EXPORT OUTFILE \\.\pipe\testpipe
FORMAT TEXT MODE RECORD;

But this error tells me FastExport is looking for an actual file name like C:\test.txt:

**** 17:35:21 UTY4019 Access module error '34' received during 'File open' operation: 'pmUnxDskOpen: fopen error (Invalid argument)' File Name : '\\.\pipe\testpipe'.

It is possible that I've not configured my pipe correctly, so if there is anyone out there with experience in this matter, I need some help. Thanks!
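I don't believe FastExport's OUTFILE can open a Windows named pipe directly (hence the fopen error); the supported route is the Named Pipes Access Module, which this post hoped to avoid, but for completeness the clause looks roughly like this (init string omitted, module name as shipped on Windows):

```sql
.EXPORT OUTFILE testpipe
        AXSMOD np_axsmod.dll
        FORMAT TEXT MODE RECORD;
```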

Hi all, while loading data from a flat file into a Teradata table using MLoad, a few records had bad data. Due to these records MLoad failed, and this data wasn't captured in the ET tables. So I have used the "DISPLAY ERRORS NOSTOP" command to make sure that MLoad completes without failing. But I want to write such bad records to a file for further investigation. Ca


I have a table coming in from Sybase using BCP. It is pipe delimited. There is a date field I will call RecordDate. It is VARCHAR(30), but it needs to be a DATE so that it can be compared to another table using the date and some other fields. I tried several ways to format the FastLoad program to bring it in as a DATE instead of a VARCHAR(30), but the program will only work if I bring it in as a VARCHAR(30). I looked at the input file: it is pipe delimited, and in the VARCHAR(30) field I only see something like this: 12/31/2008. I am a supervisor and have been hands-off for a while, but we are short-handed and I'd like to get this working. Any suggestions? Please? Thanks.
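If your FastLoad release accepts it (worth a quick test), one trick is to keep the DEFINE as VARCHAR and apply a FORMAT on the placeholder in the INSERT, so the text is converted to DATE as each row is inserted. A sketch with invented names, assuming the incoming text really is MM/DD/YYYY:

```sql
set record vartext "|";
define
  record_date (VARCHAR(30)),
  other_col   (VARCHAR(50))
file = bcp_extract.txt;

insert into MyDB.Target (record_dt, other_col)
values (
  :record_date (FORMAT 'MM/DD/YYYY'),   /* VARCHAR converted to DATE on insert */
  :other_col
);
```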

Can someone guide me in installing the Teradata Geospatial software?

I have the following setup

1.Teradata database version 12.0 running on Linux server
2.Teradata Tools and utilities Version T13 running on Windows XP

We are calling TPump from Informatica to load a flat file. Though the number of records is low, it is taking too long to load the data. Here is a list giving the number of rows loaded and the time taken. Even for as few as 1707 rows it takes over 13 minutes, while another run of 1468 rows takes only 12 seconds. Is there anything I can do to resolve this? We are using 8 sessions with a pack of 20.

No of rows   Time taken
1468         12 sec
3567         21:10 min
1188         15 sec
2123         6:44 min
298          19 sec
137          13 sec
1707         13:48 min
26045        25:27 min
25266        18:23 min
6552         39:34 min

Prahlad

Can I load data into views using FastLoad? When I tried to load data into a view using FastLoad, it throws an error! But loading data into a view using MultiLoad works fine (with a single session or multiple sessions); it inserts the data into the view. What's the reason behind this? I mean, why can't we load data into views using FastLoad?

Hi, I am using the following .ACCEPT command in MLoad to accept a system variable:

.LOGTABLE ****_WORK.LT_*****MLOAD ;
.RUN FILE LOGON;
.SET SRC TO '&SYSDAY';
.ACCEPT TESTC FROM ENVIRONMENT VARIABLE SYSDAY;

But the MLoad is failing with "UTY4827 The requested environment variable S

Dear all, I am new to Teradata. We just started forklifting DB2 UDB stuff to Teradata. As I am new to Teradata, I would like to have samples that help me convert the existing DB2 code to Teradata. Please post some experiences if you have already migrated/forklifted DB2 stuff to Teradata.

Hi, I run BTEQ steps on mainframes by using a dataset containing BTEQ statements. Whenever I need to analyze the statements, I find it difficult to distinguish valid statements from commented-out statements, as they all look the same.
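Teradata SQL accepts both ANSI double-dash and C-style block comments, and consistently using a distinct style for disabled statements can make them stand out when scanning a dataset. For example:

```sql
/* Commented out 2009-09-01, kept for reference:
UPDATE MyDB.T1 SET col1 = 0;
*/

-- Active statement below
SELECT COUNT(*) FROM MyDB.T1;
```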

I am working in a mainframe environment, using a file which contains special characters, like "Société Generale", that will be loaded into a column defined as CHAR(30). If I load this with the MLoad utility, the job does not fail, but the values are not loaded properly (the value loads as "Soci‘t‘ Generale"). I am using a file which is "~" separated. Please let me know how to resolve this problem. Note: the file contains normal records (without special characters) as well as records with special characters. Channel-attached systems use the EBCDIC character set by default. I tried using UTF8, but the layout is not taken properly (I get an error saying the length does not match).

I'm trying to export the result of an SQL query through Queryman (SQL Assistant) [version 7.2, TD V2R6.2]. Whenever I run the query, instead of asking for the path where to save the result set, it throws two errors sequentially, yielding no answer set. Please check the two errors annexed to this post... Please help to fix the errors as

Hi, my situation is that we need to load a million rows of data from SQL Server to Teradata, but we have to use TPump for it. My question is: does TPump support a database as input? If yes, please explain.
Thanks in advance

I have asked this before but never received an answer, so I will try again. Our production load jobs are abending with this message:

**** 23:13:34 UTY4015 Access module error '34' received during 'read' operation on record number '5789498': 'pmunxRBuf: fread error (Invalid argument)'

What is causing this to occur? When I restart the job, it will sometimes complete successfully with no changes being made, or it abends with a different record being flagged. This is a large file, 6+ million records, but this error seems to appear randomly. Is it space related? Server space issues? Is this a Teradata issue? What can I do to prevent this error from happening in the future?
Thanks!
Paul

We want to load a large object into a BLOB column with a command-line loading tool. I guess it can be loaded with Java applications, but how can it be loaded with BTEQ, FastLoad, or MultiLoad?

I've data which has a comma (,) as the delimiter, and each record's fields are enclosed by double quotes, in the format below (2 rows given):

"000002BF1A4ED13C6D2AC92F55A251A1","13 ","~DV ","01/01/1111 00:00:00","10/12/2001"
"0000203BBFD112DCD4C4F4673FA766E1","13 ","~DV ","01/01/1111 00:00:00","08/21/2006"

I want to use FastLoad to load these

Hello all, I am having trouble understanding the difference between SELECT * FROM table X and SELECT a, b, c (columns) FROM table X (the same table). I was wondering if someone actually knows what the difference is at the Teradata DB architecture level.
Thanks,
Aditi

I regularly use DATA format to produce output files from BTEQ running in TSO on an IBM mainframe, and I'm trying to do something similar using BTEQ running on Windows. I'm finding that the only output type that appears to work well is the REPORT format; DATA seems to introduce odd characters, wrap lines, etc. REPORT format is almost fine for me, but I really do NOT want the column headings. I've read the manual and found that I can turn off the dashes under the column heading, but I cannot find a way of getting rid of the column heading itself. Is it possible? If not, can anybody point me to some advice on how to make the DATA output type work better in Windows?
Thanks,
T
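One idiom that may help with the heading itself, alongside the dash setting: give each column an empty TITLE in the SELECT. In my experience TITLE '' blanks the heading text, though whether the empty heading line disappears entirely can vary by release, so test it. A sketch (file and column names hypothetical):

```sql
.SET TITLEDASHES OFF
.EXPORT REPORT FILE = out.txt
SELECT col1 (TITLE ''), col2 (TITLE '')
FROM MyDB.T1;
.EXPORT RESET
```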

Please give me more information about the differences between JDBC Driver Release 13.00.00 and JDBC Driver Release 12.00.00, and what is new in JDBC Driver Release 13.00.00.
Thanks

I have the following MLoad script where I need to assign a default value for JOB_TTL_NM in the .dml. Can anybody tell me how this can be done? I tried putting the default value in different formats like 'Non-Associate', "Non-Associate", and :JOB_TTL_NM = 'Non-Associate', but I'm getting an error.

.logtable UD131.ABC_LT;
.logon ABC/XXX,XYZ;
.begin mload tables UD131.ABC
    worktables UD131.ABC_WT
    errortables UD131.ABC_ET UD131.ABC_UV
    sessions 10
    sleep 10
    tenacity 4
    errlimit 1;
.layout mylayout;
.field ENT_USER_ID * VARCHAR(6);
.field EMPL_LST_NM * VARCHAR(80);
.field EMPL_FRST_NM * VARCHAR(80);
.field EMAIL_ADR * VARCHAR(180);
.field LEVEL_DESC * VARCHAR(80);
.field JOB_TTL_NM * VARCHAR(50);
.field MGR_LST_NM * VARCHAR(80);
.field MGR_FRST_NM * VARCHAR(80);
.dml label mydml;
insert into UD131.ABC
(ENT_USER_ID, EMPL_LST_NM, EMPL_FRST_NM, EMAIL_ADR, LEVEL_DESC, JOB_TTL_NM, MGR_LST_NM, MGR_FRST_NM)
values
(:ENT_USER_ID, :EMPL_LST_NM, :EMPL_FRST_NM, :EMAIL_ADR, :LEVEL_DESC, :JOB_TTL_NM = 'Non-Associate', :MGR_LST_NM, :MGR_FRST_NM);
.import infile myfile.csv
    format vartext ','
    layout mylayout
    apply mydml;
.end mload;
.LOGOFF;
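Since the DML label in MultiLoad contains ordinary SQL with :field placeholders, one approach that may work (test on your release) is to apply the default with COALESCE in the VALUES list rather than assigning it in the placeholder:

```sql
.dml label mydml;
insert into UD131.ABC
  (ENT_USER_ID, EMPL_LST_NM, EMPL_FRST_NM, EMAIL_ADR,
   LEVEL_DESC, JOB_TTL_NM, MGR_LST_NM, MGR_FRST_NM)
values
  (:ENT_USER_ID, :EMPL_LST_NM, :EMPL_FRST_NM, :EMAIL_ADR,
   :LEVEL_DESC,
   COALESCE(:JOB_TTL_NM, 'Non-Associate'),  /* default when the field is NULL */
   :MGR_LST_NM, :MGR_FRST_NM);
```

With FORMAT VARTEXT, a missing or empty field typically arrives as NULL, so the COALESCE supplies 'Non-Associate' exactly when no title is present.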