1550 - 1600 of 2731 tags for Tools


Any idea why TPT is using more spool than the same query in other tools such as SQL Assistant?

Hello everyone,

I am new to Teradata and I am trying to upload a file using FastLoad.
When I run the batch file I get the following error in the log file:

Not enough fields in vartext data record number: 1

In the Teradata table I have VARCHAR(100) for all the columns, so I'm not sure what is going on.

My script is the following:

sessions 1;
errlimit 10;
logon DWSANA/username,password;

set record vartext;
begin loading D_CFAEISDB.FTP_profile_fl01 errorfiles d_cfaeisdb.Err1,d_cfaeisdb.Err2;
RECORD 1;
DEFINE FILE=\\is-m-54lxx-fs12\DBP_PLANNING\Fast Load\FTP profile.txt;
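That error usually means a record in the file has fewer delimited fields than the DEFINE declares, or the delimiter in SET RECORD VARTEXT doesn't match the file (the default is "|"). The script above is also missing its DEFINE field list and the INSERT. A minimal sketch of the full shape — the column names and the "|" delimiter are assumptions:

```
sessions 1;
errlimit 10;
logon DWSANA/username,password;

set record vartext "|";   /* delimiter must match the file exactly */

define
    col1 (varchar(100)),
    col2 (varchar(100)),
    col3 (varchar(100))   /* one VARCHAR field per delimited column */
file = \\is-m-54lxx-fs12\DBP_PLANNING\Fast Load\FTP profile.txt;

begin loading D_CFAEISDB.FTP_profile_fl01
    errorfiles d_cfaeisdb.Err1, d_cfaeisdb.Err2;

insert into D_CFAEISDB.FTP_profile_fl01
values (:col1, :col2, :col3);

end loading;
logoff;
```

A data record whose field count falls short of the DEFINE raises exactly "Not enough fields in vartext data record".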

In BTEQ
I did .SET LOGONPROMPT OFF
then I checked whether it is really bypassing login username and password.
I tried to login with the following command
.LOGON TDPID/;
but it didn't allow me to login instead it prompted for username and password.
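For what it's worth, .SET LOGONPROMPT OFF only suppresses the prompting; it does not supply credentials, so .LOGON TDPID/; with no username still has nothing to authenticate with and BTEQ falls back to prompting. A logon without a username only works when a security mechanism provides the identity. A sketch, assuming single sign-on (e.g. Kerberos) is actually configured on the client:

```
.SET LOGONPROMPT OFF
.LOGMECH KRB5     /* assumed: a configured SSO mechanism supplies the identity */
.LOGON TDPID/
```

Without such a mechanism, the scripted form is .LOGON TDPID/username,password on one line.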

Hi

Seeking configuration assistance for TD SQL Assistant v13.x to bypass connection performance limitations

Hello, I've recently had SQL Assistant 13.0 installed on a new machine and had my data copied over, including my old history. However, when I load the history into the Access DB it looks fine in Access, but SQL Assistant won't show the SQL string, only the rows and run date (I mapped all the fields correctly and checked the field types etc. on each table).

Hello,

Just upgraded to SQL Assistant 13 - and I'm getting very frustrated. Normally, I would have a long page of code (2-3k lines) and I need to test as I go along, which means I highlight a section of code, submit and whilst that is running I edit other parts of the code - very productive! v13 seems to not allow this, which is very annoying. I've been through all the options to see how I can change this and there is nothing.

Hi.
Has anybody ever loaded idxformat 8 cobol files with FastLoad?
I have loaded idxformat 4 files successfully, but when reading the idxformat 8 files it looks like the inmod doesn't know where one record ends and the next begins.

This is the first time I have worked with COBOL files, so any info you can give on this kind of file is appreciated.

I can export some fairly large queries in SQL Assistant Java Edition, but there appears to be a soft limit somewhere around 20 MB where it just cuts off and creates an empty file. The results appear to be accurate, but the export is unwilling.

When we have a SQL statement all on one line, it works fine, but when I add line breaks (for readability) or I import SQL with line breaks, it will throw the following:

[Teradata Database] [TeraJDBC 13.00.00.20] [Error 3706] [SQLState 42000] Syntax error: expected something between the beginning of the request and the 'from' keyword.

Hi there,
We have been having a problem here at our site for some time and we are wondering if anybody else has had a similar problem.
The problem is a locking issue where the system deadlocks, and the only remedy seems to be a database restart or a manual job abort.

Here are some instances where it has occurred:

-Arcmain acquires locks on dictionary tables triggering deadlock with any user action needing access to the same tables. The system hangs until restarted or one of the jobs holding a lock is aborted.

Hi there,
I'm looking to run the SHOWLOCKS utility at regular intervals. For this I am using the Teradata Manager Scheduler, and have a recorded script called 'CHECK_TEST_LOCKS'.

At the moment I can run it and have the results sent out to a text file:

rcons.exe -S CHECK_TEST_LOCKS -O c:\Output.txt

Hi Gurus,

I got an inmod for processing a feed file, eliminating some unwanted rows, and then feeding the records to FastLoad. But I am not able to compile it and use it. I have GCC (GNU C Compiler) and TCC (Tiny C Compiler) on my Windows machine, and I have the FastLoad utility installed as well. I tried to compile the code into a DLL and succeeded using the commands below.

gcc -c mydll.c
gcc -shared -o mydll.dll mydll.o

But I don't know what's next....

Do I have to compile it in a different way, and do I need to link it to FastLoad somewhere?
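There is no separate link step against FastLoad itself: the DLL is named in the script and loaded at run time, and FastLoad looks for an exported entry point named _dynamn in it. A sketch of the script side — the table, column and logon names are hypothetical:

```
logon tdpid/username,password;

begin loading mydb.feed_table errorfiles mydb.err1, mydb.err2;

define
    col1 (varchar(50)),
    col2 (varchar(50))
inmod = mydll.dll;   /* FastLoad loads the DLL and calls its _dynamn routine */

insert into mydb.feed_table values (:col1, :col2);

end loading;
logoff;
```

If the DLL does not export _dynamn (e.g. the C function was compiled under a different name), FastLoad will fail to attach it.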

Hello,

I have a file that contains an Employer record ("M"), followed by multiple Employee records ("B"). There is nothing in the Employee record that ties it to the Employer record other than the fact that it follows the Employer record in the text file.

My question:

Is there a way (in Mload, or BTEQ), when I am loading this data when I read the "M" record (employer) I can retain the employer_id and assign it to every "B" (employee) record that follows until I encounter another "M" record? I can do this in SAS and VB, but I would prefer keeping in SQL if possible.

Record Example:
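One way to do the carry-forward in SQL, sketched below with hypothetical names: load the raw records into a staging table with a column that preserves file order (e.g. a sequence assigned at load time), then number the employer groups with a running count of "M" records and pull each group's employer_id. The assumed staging table is stg_employer_file(seq, rec_type, rec_data, employer_id), where employer_id is parsed only on "M" rows and NULL elsewhere:

```sql
SELECT seq, employer_id, rec_data
FROM (
    SELECT
        seq, rec_type, rec_data,
        -- the employer_id of the 'M' record that opened this group
        MAX(CASE WHEN rec_type = 'M' THEN employer_id END)
            OVER (PARTITION BY emp_grp) AS employer_id
    FROM (
        SELECT
            seq, rec_type, rec_data, employer_id,
            -- running count of 'M' records = group number of the current employer
            COUNT(CASE WHEN rec_type = 'M' THEN 1 END)
                OVER (ORDER BY seq ROWS UNBOUNDED PRECEDING) AS emp_grp
        FROM stg_employer_file
    ) g
) t
WHERE rec_type = 'B';
```

The running COUNT works even when employer_ids are not increasing, which a plain running MAX over employer_id would not handle.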

Please advise the privileges needed by TTU13 TQS Client ( sometimes called TDS Client ).

We will install the package on Windows 7, with users having only read-only rights.

All is fine after installing SQLAssistant. No problems installing or running the programs.

After installing TQS Client, an error message appears when SQLAssistant or TQSClient attempts to open a shared memory segment.

If the user is privileged we can set the "run as administrator" and it will work fine.

Hi All,

Can anybody clarify the below:

We are trying to load 3 files in 3 different tables within same multiload script.
As a part of impact analysis we have been asked: if one of the file loads fails, will all the insert statements be rolled back?
Basically we want to know:
1) Will these 3 inserts be executed in parallel or sequentially at the RDBMS level?
2) Will these be part of the same BT/ET, or will 3 different BT/ETs be submitted?
3) Can we load multiple files into the same table within the same MLoad, and will that be sequential or parallel?
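On the script side of point 3: MultiLoad accepts several .IMPORT commands (and up to five target tables) inside one .BEGIN MLOAD/.END MLOAD pair, so everything belongs to a single job rather than three separate BT/ET transactions. A structural sketch with hypothetical names (the LAYOUT/DML/IMPORT blocks for the second and third files are elided):

```
.LOGTABLE mydb.ml_log;
.LOGON tdpid/username,password;

.BEGIN MLOAD TABLES mydb.t1, mydb.t2, mydb.t3;

.LAYOUT lay1;
.FIELD c1 * VARCHAR(50);

.DML LABEL ins1;
INSERT INTO mydb.t1 (c1) VALUES (:c1);

.IMPORT INFILE file1.txt FORMAT VARTEXT '|' LAYOUT lay1 APPLY ins1;
/* ...LAYOUT/DML/IMPORT blocks for file2 -> t2 and file3 -> t3... */

.END MLOAD;
.LOGOFF;
```

Whether acquisition of the files overlaps internally is a separate question; this only shows the shape that keeps them in one job.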

I am trying to run Tlogview in an MVS job in similar terms as given in TPT Reference guide.

My TPT is running fine, but nothing seems to happen when TLOGVIEW step runs.

Can someone please post an example of how to run it in MVS?

Can someone help me with all the parameters and options available in the tlogview command for reading TPT logs?

Hi All,

Is there a way I can load output from an XML process into Teradata? This is new ground for me, so I have no clue how to even start looking for a solution.

Thanks!

Paul

Can someone please provide a sample script? I tried many ways but couldn't get it. Your help is really appreciated.

TPT13 installed on windows 2003 x64. I installed both x32 and x64 TPT13.
After installation, the env variable pointed to x64 version.
====================
COPLIB=C:\Program Files\Teradata\Client\13.0\CLIv2\
DataConnectorLibPath=C:\Program Files (x86)\Teradata\Client\13.0\bin
TWB_ROOT=C:\Program Files\Teradata\Client\13.0\Teradata Parallel Transporter
_MSGCATLOCATION=C:\Program Files\Teradata\Client\13.0\Teradata Parallel Transporter\msg64
====================

I changed TWB_ROOT, COPLIB and _MSGCATLOCATION to x86 TPT.
====================

Hi All,

I am encountering a strange problem with my MLoad script. I have specified an ERRLIMIT of 1 record in the MLoad script to limit the number of records being rejected into the ET table and thus make the script fail.

ERRLIMIT 1 /* Should make the job fail if more than 1 record is rejected */

When I run the script, all the records go into the ET table but MLOAD still finishes with RC=0. The ET table has error 2666 for the PKG_ESTB_DT field. I know the 2666 error and can resolve the issue, but I am worried as to why it is returning RC=0 even after setting ERRLIMIT to 1 record.

Appreciate any help.

Hi,

Please can someone send a sample script to export the column names as the header row using FastExport / TPT Export?
I am able to do this using BTEQ Export, but am facing difficulty with FastExport/TPT Export.

Regards,
Mtlrsk.
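FastExport has no header-row option, so the usual workaround is to generate the header as data: UNION ALL a literal row of column names with the data rows, cast everything to a single VARCHAR, and sort on an extra key so the header comes first. A sketch with hypothetical log/table/column names:

```
.LOGTABLE mydb.fexp_log;
.LOGON tdpid/username,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE export.out MODE RECORD FORMAT TEXT;

SELECT out_line FROM (
    SELECT 0 AS ord, CAST('col1|col2' AS VARCHAR(200)) AS out_line
    UNION ALL
    SELECT 1 AS ord,
           CAST(TRIM(col1) || '|' || TRIM(col2) AS VARCHAR(200)) AS out_line
    FROM mydb.src_table
) t
ORDER BY ord;

.END EXPORT;
.LOGOFF;
```

Casting each row to one character column also keeps the output compatible with FORMAT TEXT, which expects a single character-type export column.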

Hi Gurus,
I'm quite new to Teradata. I need to provide a real time solution that will be able to load about 10+ billion records a day into a table with about 150 columns, while users are able to query the table.
We are currently able to do about 7-8 billion a day using a different database. And we need to do 10+ so that is why we are investigating Teradata.

The data is received in text files that arrive throughout the day.
We need to load the files as they come in.

Everyone says there are 5 phases in MLOAD and 2 phases in FLOAD.
I am a little bit puzzled, as FLOAD also does the steps below:
1. logon, sessions, logtable, and UV/ET table creation, like the preliminary phase in MLOAD
2. syntax checking and sending the SQL to the TD server, like the DML transaction phase in MLOAD

Hi All,

I am new to Teradata and am working on putting image files into Teradata. I have successfully been able to do it using SQL Assistant. I was trying to check if I can do the same using BTEQ. I have been struggling with it and need your help and suggestions....

Here is my table which i created:
create table ALL_WKSCRATCHPAD_DB.devX_lob (
id varchar (80),
mime_type varchar (80),
binary_lob binary large object
)unique primary index (id);

The BTEQ file which I am using looks like this:

.set width 132
.logon usrname,pwd

delete from ALL_WKSCRATCHPAD_DB.devX_lob all;

Will FLOAD and MLOAD take the same time to load an empty table? Will there be any appreciable difference in load timing?

Regards
Sreehari

When using SAS/ACCESS to Teradata on DELL/XP with TTU7.5, the following
error may be encountered:

ERROR: Teradata connection: CLI2: ICULOADLIBERR(527): Error loading ICU libraries. .

Hi,

I am using the BTEQ import/export option. I have followed these steps in order:

Step 1
------
export data from table to file

.set width 900
.export data file=myoutput.txt
Select item_id from
db.tab1
sample 10;

.export reset;

Step 2
------
creating a new table

create table db.tab2
(item_id integer)
;

Step 3
-------
import data from file to table

.import data file=myoutput.txt,skip=3
.quiet on
.repeat *
using item_id(integer)
insert into db.tab2
(item_id
)
values
(
:item_id
);
.quit;

Hi,

I just installed version 13 and found that I can copy the answerset and paste into a text file, but once I hover over an Excel file the clipboard reverts to the previous contents. Has anyone come across that one?

Thanks,
David

I am using Fast export to copy a Teradata table to a flat file on a windows server.

.logtable database.logtable_tablename;
.logon teradata_box/user_id,password;
.begin export sessions 2;
.export outfile myoutputfile.txt RECORD MODE FORMAT TEXT;
select *
from database.table
where date_column ge cast ('01/01/2010' as date format 'dd/mm/yyyy')
and key_column eq 999999999999
;
.end export;
.logoff;

The query runs fine in sql assistant producing a readable text file. When I run fexp.exe the output file has a lot of unreadable rubbish in it.
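The likely cause: FORMAT TEXT expects each exported row to be a single character-type column, so with select * the internal images of dates, integers and field lengths land in the file untranslated — that is the "rubbish". Casting the row into one delimited VARCHAR usually fixes it; a sketch reusing the post's placeholder names (only the export/select portion changes):

```
.export outfile myoutputfile.txt MODE RECORD FORMAT TEXT;
select cast(key_column as varchar(20)) || '|' ||
       cast(cast(date_column as format 'DD/MM/YYYY') as char(10))
from database.table
where date_column ge cast ('01/01/2010' as date format 'dd/mm/yyyy')
and key_column eq 999999999999
;
.end export;
```

SQL Assistant shows a readable grid because the ODBC layer converts every column to text for display; FastExport in record mode writes the raw column images unless the query does the conversion itself.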

Hi all,

while MLoading a single table with multiple files, does Teradata load these files in parallel or one after the other?

Regards

Do FastLoad/FastExport have Windows 64-bit versions?

Experts,
I am looking for a solution which can synchronize/replicate data between two Teradata production boxes. I know there are a couple of solutions like Teradata Mover, NPARC, and Golden Gate.

I would like to know if Teradata Parallel Transporter can achieve data replication/synchronization?
The objective is to move full/partial data between two Teradata boxes.
Are there any limitations on TPT, like can we move journals, indexes (JOIN/HASH), and stats using TPT?
And how is the performance of TPT, i.e. how do you achieve an SLA of an hour?

hi all,
I am trying a FastLoad inside a Unix script, where the table being inserted into has six columns: three get their data from the flat file, while the other three are date columns that have to be derived from the date that comes along with the flat file name.

If you are going to use the Teradata utilities, will it change the architecture or SDLC?
If yes then why and how?

Hi,
I tried to use the Visual Explain tool V 13 but get the following errors:

[Teradata][ODBC Teradata Driver][Teradata Database] Internal error: Please do not resubmit the last request. SubCode, CrashCode: 0, 3701.

I inserted a complex query with joins and GROUP BY clauses using the "Insert Explain" option in the tool. I believe the QCD database should not be the problem, because the tool worked fine with a simpler query (no GROUP BY, just a simple join).

Anyone has seen this problem before?

ps: I am using the Teradata Express DB 13 Linux VM as my database server.

In Teradata SQL Assistant 12, I can save all sheets to one Excel file, but Teradata SQL Assistant 13 can't. Why?

SQL Assistant 13.0 has a menu button to suppress the inclusion of a column of row numbers as headers when copy-pasting or exporting answersets. If you switch this feature off using the button, it resets itself to the default of including the row headers every time a new answerset opens.

I had been doing Teradata export and import through named pipes until I used TPT. Now that I have learnt how to write a TPT script, I have been using import from the export operator in TPT.
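For anyone looking for the shape of that export-to-load job: in TPT the pipeline is a single APPLY that reads from the EXPORT operator and feeds the LOAD operator, with no intermediate file or named pipe. A skeleton with hypothetical names (logon values and error-table attributes trimmed):

```
DEFINE JOB copy_table
(
    DEFINE SCHEMA src_schema ( col1 VARCHAR(50), col2 VARCHAR(50) );

    DEFINE OPERATOR export_op
    TYPE EXPORT
    SCHEMA src_schema
    ATTRIBUTES ( VARCHAR TdpId, VARCHAR UserName, VARCHAR UserPassword,
                 VARCHAR SelectStmt = 'SELECT col1, col2 FROM src_db.src_tbl;' );

    DEFINE OPERATOR load_op
    TYPE LOAD
    SCHEMA *
    ATTRIBUTES ( VARCHAR TdpId, VARCHAR UserName, VARCHAR UserPassword,
                 VARCHAR TargetTable = 'tgt_db.tgt_tbl',
                 VARCHAR LogTable    = 'tgt_db.tgt_tbl_log' );

    APPLY ('INSERT INTO tgt_db.tgt_tbl (col1, col2) VALUES (:col1, :col2);')
    TO OPERATOR (load_op)
    SELECT * FROM OPERATOR (export_op);
);
```

The LOAD operator imposes the usual FastLoad restrictions (empty target table, no fallback journals during load), so this shape suits full copies rather than incremental merges.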

One of my users is experiencing a problem while attempting to connect to our development environment via SQL Assistant. SQL Assistant opens up, and he clicks on the 'CONNECT' button. He then selects his datasource and the logon screen pops up. Once he enters his logon information and selects 'OKAY' the entire thing disappears...

Hi,

I get an ERROR when I run BTEQ with the QUERY_BAND option.
Is QUERY_BAND supported by BTEQ?

BTEQ 12.00.00.09 Tue Jun 29 14:14:41 2010

+---------+---------+---------+---------+---------+---------+---------+----
.logon DWP/DBA,

*** Logon successfully completed.
*** Teradata Database Release is 12.00.02.35
*** Teradata Database Version is 12.00.02.36
*** Transaction Semantics are BTET.
*** Character Set Name is 'ASCII'.

*** Total elapsed time was 18 seconds.

+---------+---------+---------+---------+---------+---------+---------+----

Below is a SAS script that passes variables thru the same code many times. I would like to do the same type of thing in Teradata with my initial code looking like this for an UPDATE.

1. First, this is an UPDATE statement
2. I need to observe the current month's and prior month's MISS_PMT_COUNT over years to determine the MOST RECENT TIME when a mortgage account went from delinquent to cured and vice versa.
3. If there is some way to determine &N observations on the MIN(STATEMENT_DATE) for the pass through values, that would be great.

Thank you in advance,

Product ID: B035-3119-088K
Product Title: TERADATA TOOLS AND UTILITIES 13.00.00 SUPPORTED PLATFORMS AND PRODUCT VERSIONS
Date Added to Web Site: May 27, 2010
http://www.info.teradata.com/eDownload.cfm?itemid=082340004&sitenm=internal

I am getting this error while running the mload:
02:42:25 UTY2403 An invalid statement was found before the .LOGON statement.
02:42:25 UTY2410 Total processor time used = '0 Seconds'

Please help me!!!!

I have 2 BTEQs with the following statements. Both are submitted at the same time and both are trying to insert into the same table.

BTEQ1
---------------------------------
BT;
INSERT INTO T1 SEL * FROM GTT;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
ET;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
.QUIT
.LABEL LABELS
.QUIT ERRORCODE

BTEQ2
---------------------------------
BT;
INSERT INTO T1 SEL * FROM GTT;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
ET;
.IF ERRORCODE <> 0 THEN .GOTO LABELS
.QUIT
.LABEL LABELS
.QUIT ERRORCODE

-----------------------------

What effect does MultiLoad have on performance? Does MLOAD take up all the resources and leave other utilities like BTEQ and SQL Assistant with decreased performance?

I am using FastExport batch mode on Windows and want to schedule FastExport using a batch file.
I want to know how to include the current date in the export file name.

.BEGIN EXPORT SESSIONS 8;
.export outfile "Export.out"
MODE RECORD FORMAT TEXT;
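The date cannot be computed inside the FastExport script itself, but the batch file that launches fexp can build a stamp and rename the output afterwards. A sketch, assuming a US-style %date% such as "Thu 07/01/2010" — the substring offsets vary with the Windows locale settings, and the script/file names are hypothetical:

```
@echo off
rem build YYYYMMDD from %date%; adjust the offsets to the local date format
set stamp=%date:~10,4%%date:~4,2%%date:~7,2%
fexp < my_fastexport_script.txt
ren Export.out Export_%stamp%.out
```

An alternative is to have the batch file write the .EXPORT line (with the stamped name) into the script before invoking fexp, which avoids the rename step.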

It works perfectly with the x86 ODBC driver, but fails with the x86_64 ODBC driver.

Does anyone have a suggestion on this issue? I have installed tdicu, teragss and the TD ODBC driver.

Error message is 'Specified driver could not be loaded due to system error 126 (Teradata)'

Can anyone help me in trying to load a pipe delimited text file with headers and footers?

I have been trying to load the file in one column VARCHAR(300) to then use some code to search for each | and extract each field in turn.

I am using BTEQ import as the file is small, but I'm having trouble with loading - error message = *** Failure 3857 Cannot use value (or macro parameter) to match 'DUMMY_FIELD'.

script looks like:

DATABASE U_CSMIDB;

.if errorcode !=0 then .quit errorcode

CREATE TABLE WT4204_WBS_PROJECT_LOAD
(
DUMMY_FIELD VARCHAR(300)
)
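Failure 3857 generally means the record layout BTEQ derived from the import does not line up with the parameters the INSERT expects. For a one-column load of whole lines, a common shape is .IMPORT VARTEXT with a delimiter that never appears in the data, plus a USING clause naming the single field. A sketch continuing the post's names — the '~' delimiter, file name, and skip count are assumptions:

```
.import vartext '~' file = wbs_project.txt, skip = 1
.quiet on
.repeat *
using (DUMMY_FIELD varchar(300))
insert into WT4204_WBS_PROJECT_LOAD (DUMMY_FIELD)
values (:DUMMY_FIELD);
.quit;
```

With a delimiter that is not in the file, each whole line lands in DUMMY_FIELD, and the header/footer rows can then be skipped on import or filtered out with SQL afterwards.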