1550 - 1600 of 2675 tags for Tools

I tried out the Macro possibilities inside SQL Assistant. Unfortunately I saved it to the letter "A", so now the macro runs every time I hit this character... Does anyone know how to delete a recorded macro inside SQL Assistant?

Windows 7 64-bit
Teradata v12
SQLA Java 64-bit

I am trying to export a view that contains about 3500 rows, but I am having a hard time exporting the entire result set beyond what is being displayed.

1.
Currently SQLA only displays 500 rows, which is OK. When I select "export all" to Excel, I was expecting to see all 3500 entries, but instead I end up with only the 500 displayed rows. How can I export all rows?

2.
I have tried changing the display limit to 5000 instead of the default of 500, but SQLA still displays only 500. Is this a bug in the tool or am I doing something wrong?

Hi everyone,
I have a problem using TPT. When I use the Oracle or Informix ODBC operator in my Teradata PT job, I receive an ODBC connect error or an ODBC driver error.

The following is my machine description:
1: Redhat linux 2.6.9
2: Software:
Informix CSDK 2.9
Oracle database 10g
Freetds0.82
TTU 13.00.00.2

perl 5.8.8
DBI 1.608
DBD-ODBC-1.23
unixODBC-2.2.14-p2

The following are my test steps:
1: When I set $ODBCHOME=/opt/teradata/client/13.0/odbc_32 and $ODBCINI=/opt/teradata/client/13.0/odbc_32/odbc.ini,

Hi,

I am using MLoad to delete data from a 10 TB table.
The table is partitioned by date.
The table has a NUSI on two columns.

When we dropped the 20 partitions without dropping the NUSI, it took 2 days.
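
For anyone hitting the same wall, below is a minimal sketch of the two usual alternatives, assuming a hypothetical table sandbox.sales_hist partitioned on sale_date; verify the syntax against your actual DDL. Note that with a NUSI in place every deleted row must also be removed from the index subtable, which is usually why such deletes run long; dropping the NUSI first and recreating it afterwards is the common workaround.

/* Sketch only - all names and dates are placeholders. */
/* Option 1: drop the partitions themselves: */
ALTER TABLE sandbox.sales_hist
  MODIFY PRIMARY INDEX
  DROP RANGE BETWEEN DATE '2009-01-01' AND DATE '2009-01-20'
  WITH DELETE;

/* Option 2: a DELETE the optimizer can satisfy with partition elimination: */
DELETE FROM sandbox.sales_hist
WHERE sale_date BETWEEN DATE '2009-01-01' AND DATE '2009-01-20';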

Hi All,

When running the TPT script to load a 12 GB file into the target table, the script fails with the error message below, whereas the same script ran successfully with a few 10000 records.

Teradata Parallel Transporter Version 12.00.00.00
Execution Plan generation started.
Execution Plan generation successfully completed.
Job log: /usr/tbuild/12.00.00.00/logs/test_load-1.out
Job id is test_load-1, running on a01faif1a
Teradata Parallel Transporter DataConnector Version 12.00.00.00
FILE_READER: Instance 1 directing private log report to 'FILE_READER_LOG-1'.

Hi All

If anyone has a script generator that will generate MLoad, FLoad, or FExp empty framework templates, please send it to me at chiram1.etl@gmail.com

Thanks
Raghavender
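
Not a generator, but for anyone after a starting point, here is a bare FastLoad skeleton in the same style as the scripts elsewhere on this page; every identifier, the delimiter and the file path are placeholders to replace:

/* Minimal FastLoad framework - all names below are placeholders. */
sessions 4;
errlimit 100;
logon tdpid/username,password;
set record vartext "|";
define col1 (varchar(20)),
       col2 (varchar(20))
file = /path/to/input.txt;
begin loading mydb.mytable
errorfiles mydb.mytable_err1, mydb.mytable_err2;
insert into mydb.mytable values (:col1, :col2);
end loading;
logoff;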

Hi all,

I'm new to FastLoad and I'm looking for some expert advice. I've tried to load data into my table using a CSV file as a flat file, but I keep getting the following errors:

Number of rec/msg:7140
Start to send to RDBMS with record 1
RECORD is too long by 21311 bytes
Field 1: 168641609

Full scripts as follows;

sessions 8;
errlimit 250;

logon cda/userid

define cust_no (decimal(14,0))
file =X:\Use Id\Cust.CSV;

begin loading d_database.table
errorfiles d_database.err1, d_database.err2;

insert into d_database.table values(:cust_no);
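
For what it's worth, FastLoad assumes binary FastLoad-format records unless told otherwise, which is one common cause of "RECORD is too long" when pointing it at a CSV. A hedged rework of the script above for delimited text input - SET RECORD VARTEXT, and the field defined as VARCHAR, since VARTEXT fields must be VARCHAR (the password is a placeholder):

sessions 8;
errlimit 250;
logon cda/userid,password;      /* password placeholder */
set record vartext ",";         /* treat the input as comma-delimited text */
define cust_no (varchar(20))    /* VARTEXT fields must be defined as VARCHAR */
file = X:\Use Id\Cust.CSV;
begin loading d_database.table
errorfiles d_database.err1, d_database.err2;
insert into d_database.table values(:cust_no);
end loading;
logoff;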

All,

I tried installing the Teradata Eclipse Plugin on MyEclipse 7.5 and got stuck with the dependency problems mentioned below. I am not sure whether the Teradata Eclipse Plugin works with MyEclipse.
Can anybody assist me in resolving these?

Cannot complete the install because some dependencies are not satisfiable
Unsatisfied dependency: [com.teradata.datatools.dtpsupportFeature.feature.group 13.1.0.200910281454] requiredCapability: org.eclipse.equinox.p2.iu/org.eclipse.datatools.sqltools.sql.ui/0.0.0

I want to insert some values into my table using INSERT INTO ... SELECT SQL, as:

INSERT INTO tableName (col1, col2, col3)
SELECT col2, col3 FROM tableName;

For col1 I need to pass the value coming from a file (using (:next_var)) and increment it for every row coming from the SELECT.
How can I do this?
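
One hedged way to do this in a single statement, assuming the starting value arrives through a BTEQ USING variable and the increment only needs to be sequential, not gap-free: add ROW_NUMBER() to the seed. The file name is hypothetical; use .IMPORT VARTEXT instead of DATA if the seed file is plain text (with next_var declared VARCHAR).

.IMPORT DATA FILE = start_value.dat;    /* hypothetical file holding the seed */
USING (next_var INTEGER)
INSERT INTO tableName (col1, col2, col3)
SELECT :next_var + ROW_NUMBER() OVER (ORDER BY col2)
     , col2
     , col3
FROM tableName;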

Hi,
A checkpoint is used to resume a paused job from the point where it was paused because of an error caused by the client or by the RDBMS. Am I right with this definition?
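
Broadly yes - a checkpoint is a restart point recorded in the restart log, so a job interrupted by a client or RDBMS error can resume from the last checkpoint instead of from the beginning. The interval is declared in the script; for example in MultiLoad (names and the value here are illustrative):

.BEGIN MLOAD TABLES mydb.target
       CHECKPOINT 10000;   /* 60 or more = rows between checkpoints; under 60 = minutes */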

Hi,
I am trying to load two records from a notepad file into a table using the FastLoad utility. It works fine if there is a single record, but when I add a newline char (CHAR(1)) I get the below error.

0006 insert into samples.saat3
(:name,
:id);

Number of recs/msgs: 3780
Starting to send to RDBMS with 1 record
Incorrect number of bytes returned from a File Read!
Expected: 12, Received:2

Below are the scripts used:

Records present in the notepad:
raghu 300 |
ram 200 |

Script used in the FastLoad utility:
.logon 127.0.0.1/dbc,dbc;

create table samples.saat3
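
The "Expected: 12, Received: 2" message suggests FastLoad is trying to read fixed-length binary records while the notepad file is newline-terminated text. A hedged adjustment, assuming '|' really is the intended field delimiter in those records, is to declare VARTEXT input with VARCHAR fields:

set record vartext "|";          /* newline-terminated, '|'-delimited text */
define name (varchar(10)),
       id   (varchar(10))
file = C:\path\records.txt;      /* path placeholder */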

I have been asked by the server admins to provide a space requirement to install the following Teradata 12 utilities on a SUN Solaris SPARC server:

(1) CLI and related security libraries
(2) BTEQ
(3) FastLoad
(4) MultiLoad
(5) FastExport

Hello,

What would cause an MLoad to abend with this error: 2575: Timed out waiting for lock on
TDWWorkTbls.ET_TARStage

No other job is accessing the table, and all I am trying to do is load data into staging. I have never had this error before on loading.

Any ideas how I can get around this?

Thanks!

Paul
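
A 2575 usually means the table is still under an MLoad lock left by an earlier run that did not finish cleanly, even if nothing is touching it now. The usual hedged first step, run only once you are sure no MLoad job is actually active (the target table name is a placeholder):

/* Release the MLoad lock on the target table of the failed job: */
RELEASE MLOAD mydb.staging_table;

/* If the failed job had already reached the application phase: */
RELEASE MLOAD mydb.staging_table IN APPLY;

/* Then drop the leftover work/error tables, e.g. the one named in the error: */
DROP TABLE TDWWorkTbls.ET_TARStage;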

I would like to be able to produce a file by running a command or batch which basically exports a table or view (SELECT * FROM tbl) in text form (default conversions to text for dates, numbers, etc. are fine), tab-delimited, with NULLs converted to empty fields (i.e. a NULL column would have nothing between the tab characters), with appropriate line termination (CRLF for Windows), and preferably also with column headings.

This is the same export I can get in SQL Assistant 12.0 by choosing the export option, using a tab delimiter, setting my NULL value to '' and including column headings.
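
In case a scriptable equivalent helps, here is a BTEQ sketch that approximates that export in report mode. It does mean listing the columns instead of SELECT *, the separator strings below must contain a literal TAB character, and all names are placeholders; trailing blanks on each line may still need trimming downstream.

.LOGON tdpid/username,password;
.SET WIDTH 65531;
.SET TITLEDASHES OFF;
.EXPORT REPORT FILE = C:\exports\tbl.txt;
SELECT TRIM(COALESCE(CAST(col1 AS VARCHAR(64)), '')) || '	' ||
       TRIM(COALESCE(CAST(col2 AS VARCHAR(64)), ''))
       (TITLE 'col1	col2')
FROM mydb.tbl;
.EXPORT RESET;
.LOGOFF;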

Hi All,

What version of the Teradata utilities (SQL Assistant, Teradata Administrator, ODBC drivers) will work with a laptop that has Windows 7 installed? Where can I find the correct download site for this?

Thanks!

Paul

Teradata experts,

I use Teradata SQL Assistant to view our SQL Server. I recently upgraded to v13, and I can no longer see all my tables in Explorer view, though I can still query them. It looks like the Explorer view only allows a set number of tables (around 40).

Hi, I have a dilemma… I have a table A that has some special characters in a column aa. When inserting into table B, the following SQL gives the untranslatable character error:

Insert into B.bb
sel CASE WHEN A.aa IS NULL OR A.aa = '' THEN '*' ELSE A.aa END AS aa
from A

The column DDL is VARCHAR(16) CHARACTER SET LATIN NOT CASESPECIFIC in both the A and B tables. However, if I skip this test and just insert the column A.aa as it is, I can do something like this later:

Insert into B.bb
sel A.aa from A;

update B.bb
set B.bb = '*'
where B.bb is null or B.bb = '';
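
One way to see which rows would trigger the error is TRANSLATE_CHK, which returns 0 when a value converts cleanly and the position of the first offending character otherwise. A sketch - the LATIN_TO_UNICODE mapping is an assumption; pick the mapping that matches where your data actually came from:

-- Find rows in A whose aa contains characters that do not translate cleanly:
SELECT aa,
       TRANSLATE_CHK(aa USING LATIN_TO_UNICODE) AS first_bad_pos
FROM A
WHERE TRANSLATE_CHK(aa USING LATIN_TO_UNICODE) <> 0;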

Hi All,

I have a requirement to test whether a database is up or down. I selected BTEQ to do my testing and below is my logic. As BTEQ continuously retries the logon, my script runs continuously without exiting.

BTEQ Interactive Mode:

$ bteq
Teradata BTEQ 12.00.00.00 for UNIX5. Copyright 1984-2007, NCR Corporation. ALL RIGHTS RESERVED.
Enter your logon or BTEQ command:
.logon 90.11.111.111/abc123
Password:
*** Warning: DBS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS
*** Warning: DBS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS
*** Warning: DBS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS

Assume 90.11.111.111 is a valid UNIX server with no Teradata database on it; my test succeeds for a valid server with a valid database. Could someone help me identify the right BTEQ parameters?

Thanks
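
The endless logon loop is BTEQ's retry behavior. A hedged batch sketch that fails fast and hands a return code back to the shell (credentials and the probe query are placeholders):

.SET RETRY OFF
.LOGON 90.11.111.111/abc123,password
.IF ERRORCODE <> 0 THEN .EXIT 1
SELECT 1;
.IF ERRORCODE <> 0 THEN .EXIT 1
.LOGOFF
.EXIT 0

Then test $? in the calling shell script: 0 means the database answered, anything else means it did not.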


Cannot open file: 'file'

===================================================================
=                        Logoff/Disconnect                        =
===================================================================
**** A:B:C Logging off all sessions
**** D:E:F Total processor time used = 'Z Seconds'
     . Start : X X X X
     . End   : Y Y Y Y
     . Highest return code encountered = '12'.

Please let me know what the reason could be. As per my knowledge it should be one of:
1) FLOAD does not have access to that file
2) The format of the file is different from what it should be according to the FLOAD script
Correct me if I am wrong, and please suggest.

Hi All,

I have a requirement where I have to build (generate) an Excel sheet from a table, but it can have more than 2-3 lakh records, and as we all know one Excel sheet can't have more than about 65K records, so records beyond 65K should go on the next sheet in the same Excel file. This process will be part of a job on a UNIX box.

Using BTEQ EXPORT? Or anything else?

And second, that Excel sheet should be stored in a database table as BLOB data. Does anybody have experience in Java or C? Because I am not sure how we can achieve this.

I have just upgraded my Teradata client utilities to version 13.0 and my FastExport processes are not working correctly. Specifically, FastExport is not returning all of the rows from the source table. Examining the log file does not help much, but it does appear to be an incomplete log file. The logtable is not removed, and subsequent runs of the same FastExport produce the same results, only with the log showing that a restart is in progress. This is all that appears in the log file:

FastExport Utility Release FEXP.13.00.00.009
Platform WIN32
Copyright 1990-2009 Teradata Corporation. ALL RIGHTS RESERVED.
**** 15:34:12 UTY2411 Processing start date: MON FEB 22, 2010

Logon/Connection
0001 .LOGTABLE pp_scratch_gcoa.SurveyFacts_log;
0002 .LOGON caracal/pp_gcoa_user,;
**** 15:34:15 UTY8400 Teradata Database Release: 12.00.02.38
**** 15:34:15 UTY8400 Teradata Database Version: 12.00.02.35
**** 15:34:15 UTY8400 Default character set: ASCII
**** 15:34:15 UTY8400 Current RDBMS has UDT support
**** 15:34:15 UTY8400 Current RDBMS has Large Decimal support
**** 15:34:15 UTY8400 Maximum supported buffer size: 1M
**** 15:34:15 UTY8400 Data Encryption supported by RDBMS server
**** 15:34:19 UTY6211 A successful connect was made to the RDBMS.
**** 15:34:19 UTY6217 Logtable 'pp_scratch_gcoa.SurveyFacts_log' has been created.

Processing Control Statements
0003 .BEGIN EXPORT SESSIONS 4;
0004 SELECT * FROM pp_scratch_gcoa.TDImporter;
0005 .EXPORT OUTFILE "C:\TDImporter\E2E\OLELoadScripts\Survey_Fact.amj" AXSMOD Oledb_Axsmod 'batch';
0006 .END EXPORT;

FastExport Initial Phase

Any help or feedback would be appreciated.

Hi,

I'm running Win7 64bit and am connected to a v6.2 Teradata db in SQL Assistant 13.0, however none of the Views or Tables are showing up in the Data Source Explorer. I can query the Views just fine; just can't see them in the tree view.

Any thoughts/suggestions?

Thanks in advance!

All,

I'm facing an issue with MLoading a table via Informatica - I get a "Broken Pipe" error, which I believe is a very common error. I tried the usual steps of releasing the MLoad lock and deleting the work tables, but the issue is still unresolved. I'm working in a shared and controlled environment and need to reach out to the DBA or other teams to delete anything for us, so I cannot try out much myself. Can someone help me identify what could be the source of this issue and where I should be looking? I've pasted the loader log below. Thanks in advance for your help.

Teradata Parallel Transporter Version 12.00.00.00
Execution Plan generation started.
Execution Plan generation successfully completed.
Job id is r2_agg_rx_pres_prod_grp1_out-2933, running on rnyldwhtstapp01.internal.com
Found CheckPoint file: /opt/teradata/client/tbuild/12.00.00.00/checkpoint/r2_agg_rx_pres_prod_grp1_outLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter DataConnector Version 12.00.00.00
PRODUCER_OPERATOR: Instance 1 directing private log report to 'PL_R2_AGG_RX_PRES_PROD_GRP_out-1'.
Teradata Parallel Transporter Update Operator Version 12.00.00.00
CONSUMER_OPERATOR: private log specified: CL_R2_AGG_RX_PRES_PROD_GRP_out
PRODUCER_OPERATOR: Instance 1 restarting.
PRODUCER_OPERATOR: DataConnector Producer operator Instances: 1
CONSUMER_OPERATOR: connecting sessions
CONSUMER_OPERATOR: preparing target table(s)
CONSUMER_OPERATOR: entering DML Phase
CONSUMER_OPERATOR: entering Acquisition Phase
PRODUCER_OPERATOR: Operator instance 1 processing file '/fproc/infa_repos_stst/TgtFiles/r2_agg_rx_pres_prod_grp1.out'.
PRODUCER_OPERATOR: pmRepos failed. General failure (34): 'pmUnxDskSetPos: 'seek' error (Illegal seek)'
PRODUCER_OPERATOR: Fatal error repositioning data.
Operator(libdataconop.so) instance(1): RESTART method failed with status = Fatal Error
PXTB_Restart: Operator restart error, status = Multi Phase Error
Task(SELECT_2[0001]): restart completed, status = Multi Phase Error
CONSUMER_OPERATOR: disconnecting sessions
PRODUCER_OPERATOR: Total files processed: 0.
Job step MAIN_STEP terminated (status 12)
Job r2_agg_rx_pres_prod_grp1_out terminated (status 12)

Hi All,

I hope I am posting my problem in the right forum. I have two DataStage jobs which insert data into the same table using the TPump utility. One of them fails when both jobs trigger at the same time. These DataStage jobs first delete data from the table based on a condition and then insert data. Also, the same control file is used in both jobs.

My gut feeling is that the cause is the same control file being used in both jobs, or TPump in one job locking the table so that the other job is unable to access it and fails. Please let me know if you are aware of this and can guide me in finding the exact reason. I shall appreciate an early response. TIA.

I have a float data type in SQL Server (SS) and float in Teradata (TD). I am reading from SS and passing directly to TD. If I use ODBC, data moves as-is, e.g. 1.2 moves as 1.2 into TD. But if I use MLoad in Informatica, it rounds the data: 1.2 becomes 1.0, and 1.5 becomes 2.0. Does anyone know how to fix it?

Hi all,

I cannot seem to locate the TTU 13.00 64-bit package online for download.

We have the 12.00 CD but it doesn't have a 64-bit install. Is there someplace online I can download the 64-bit TTU? It seems like a very common download but I can't seem to locate it in the downloads section.

Thanks!

Hi all,

I have installed the Teradata Demo version 12.0 on my PC. When I compiled a simple stored procedure, the following error was prompted:

Error: C/C++ compiler is not installed.

Is a C/C++ compiler required for running stored procedures? Where can I get this compiler, and into which folder should I install it on my PC?

Thanks,
TmrModest.

Dear users,

Per recent communication, we are pleased to announce the consolidation/merger of the Teradata Discussion Forums and the Teradata Developer Exchange Forums. The integration of the forums will create a single location - http://forums.teradata.com - for you to participate in Teradata Forums to exchange ideas with your peers and to get access to important Teradata information.

We expect the migration to occur on the weekend of February 20th, 2010.

We will be migrating across all legacy forum content, and will migrate your user account as well. For users that do not already have an account on Teradata Developer Exchange ( http://developer.teradata.com ), we will create an account using your existing Teradata Forums details, and will generate a new temporary password for you. These new account details will be emailed to you.

Please watch for additional notices that will provide information on timing, accessing and using your forum account. To read more about the consolidation visit: http://developer.teradata.com/blog/neilotoole/2010/02/teradata-forums-migration-imminent

Thank you for your patience during this process, and we look forward to seeing you on the new Teradata Forums.

- The Teradata Forums team

Hi All,

Will Teradata work on a PC with Vista installed?
If yes, please let me know.

1.) Is it possible in the Database Explorer pane to show databases in a hierarchical structure? How? If not, is there an enhancement planned for a future release?

I have two questions:

1. What is an easy way to tell how many loader slots are currently tied up? Is there a query against DBC that can be run?
2. Why do the MLoad sessions always show as IDLE in the session monitor?
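
For question 1, a hedged starting point is DBC.SessionInfo, whose Partition column identifies utility sessions. Each utility job holds several sessions, so this counts sessions rather than slots directly, and the partition strings can vary by release:

-- Count active sessions by partition; utility partitions indicate loader use.
SELECT Partition,
       COUNT(*) AS session_count
FROM DBC.SessionInfo
GROUP BY Partition
ORDER BY Partition;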

I have a BTEQ script that uses the import command to load data from a text file into a Teradata table:

.set quiet on;
.import vartext ',' file = \\MyServer\Proj\filename.txt
.repeat *
using BUSINESS_TYPE (varchar(255)), ACCOUNT_NB (varchar(255)), CNTRCT_DT (varchar(255))
INSERT INTO ZALTS1RV.ZALV281_QRMEUR_DRO
( BUSINESS_TYPE, ACCOUNT_NB, CNTRCT_DT )
VALUES
( :BUSINESS_TYPE, :ACCOUNT_NB, :CNTRCT_DT );

The script ran successfully and gave the output message:

*** Finished at input row 5455 at Wed Feb 10 15:03:50 2010
*** Total number of statements: 5455, Accepted: 5429, Rejected: 26

Is there any way to capture those rejected rows and write them to another file? Thanks

I have a decimal(17,2) field which has a default value of 999999999999999.99, i.e. 9(15).9(2). When MLoading that data to Teradata, the value is being rounded to 1000000000000000.00 (a 1 followed by 15 zeros, then .00), which causes a decimal overflow error. Also, the default value is being modified. I tried the CAST statement for similar data in TD and it did the same. The CAST statements that I tried are as below:

select CAST(999999999999999.99 as decimal(17,2)) as test;
i.e. 9(15).9(2) - gives 1000000000000000.00 as output, but

select CAST(99999999999999.99 as decimal(17,2)) as test;
i.e. 9(14).9(2) - gives 99999999999999.99, i.e. 9(14).9(2)

Please let me know if there is a way to have it MLoaded as-is.

After installing .NET 3.5 and MapInfo 10, SQL Assistant 13 freezes when running a query a second time from the query window. You can open a new window and run a query, and then the next query will freeze. It keeps showing that the query is running, but in the background the query has completed. Hitting the cancel button does not work.

I would like to capture the number of rows inserted. I need to do this all in a stored procedure. Here is my SQL for inserting the rows:

Insert into DEV_CORE.TABLE1 (SurrogateKey, COL1, COL2...)
Select SURROGATEKEY, COL1, COL2
From DEV_STG.TABLE1 as STG
WHERE NOT EXISTS
( SELECT 1 FROM DEV_CORE.TABLE1 as CORE1
  WHERE CORE1.COL1 = STG.COL2
  AND CORE1.COL2 = STG.COL2 )

So once the records are inserted I need to find out how many were inserted. I'm thinking count(*) or somehow using row_number?? Please help. Thanks.
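
Inside a Teradata stored procedure, the usual approach is the ACTIVITY_COUNT status variable, read immediately after the INSERT. A sketch with hypothetical names and an OUT parameter (the join columns here are corrected to COL1 = COL1 on the assumption that the COL2/COL2 pairing above is a typo):

REPLACE PROCEDURE DEV_CORE.Load_Table1(OUT rows_inserted INTEGER)
BEGIN
  INSERT INTO DEV_CORE.TABLE1 (SurrogateKey, COL1, COL2)
  SELECT STG.SurrogateKey, STG.COL1, STG.COL2
  FROM DEV_STG.TABLE1 AS STG
  WHERE NOT EXISTS
    ( SELECT 1 FROM DEV_CORE.TABLE1 AS CORE1
      WHERE CORE1.COL1 = STG.COL1
        AND CORE1.COL2 = STG.COL2 );

  /* ACTIVITY_COUNT holds the row count of the immediately preceding DML. */
  SET rows_inserted = ACTIVITY_COUNT;
END;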

Hi,

I am looking for anyone who can provide information on their experiences with XML Services. Just comments surrounding issues like:

Installation - any problems / special requirements.
The impact on the nodes - any additional overheads that should be considered.
Any issues in usage.

Many thanks.

Hi Friend,
Help me to solve the following question.

Which of the following is a statement that is true regarding temporary space?
A. Temporary space is spool space currently not used.
B. Temporary space is permanent space currently not used.
C. Temporary space is subtracted from SysAdmin.
D. Temporary space is assigned at the table level.

I can no longer run SQL Assistant from a command line since upgrading to Version 13.0. I am using a -C parameter to name an ODBC data source, but since SQL Assistant now starts up with a default of Teradata.net, I'm thinking that it does not recognize the data source. Any suggestions?

Hi All,
What is the difference between NUSI and full table scan?
Regards,
KIRAN

Hi All,

Can TPump load data on a real-time basis? As per the document provided by Teradata, we can use TPump for real-time data warehousing. In TPump we read from a file which is in a predefined format.

We upgraded to a new version of Teradata SQL Assistant - version 13. Users are connecting through an ODBC connection to an Oracle database, then adding a new database in the Database Explorer. They can drill into the Tables folder and expand a specific table, BUT when they try to expand the Columns folder, they get the following error: "The input string was not in the correct format."

They also get this error message when using the Tools -> List Columns option. They have read access to the table and can use the table and columns in a SQL command, but they can't see the columns of a given table. Any ideas on this?

I'm new to MLoad utility programming. I need a sample script which updates, inserts, or deletes on 2 or more tables from a single source file. Thanks in advance.
Learntera
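
A bare-bones sketch of the shape such a script takes; every name, the delimiter and the DML are placeholders. The point is one LAYOUT feeding several DML labels, each applied to a different table:

.LOGTABLE mydb.ml_restartlog;
.LOGON tdpid/username,password;
.BEGIN MLOAD TABLES mydb.t1, mydb.t2;

.LAYOUT src_layout;
.FIELD key_col * VARCHAR(10);
.FIELD val_col * VARCHAR(20);

.DML LABEL ins_t1;
INSERT INTO mydb.t1 (k, v) VALUES (:key_col, :val_col);

.DML LABEL ups_t2 DO INSERT FOR MISSING UPDATE ROWS;   /* upsert on t2 */
UPDATE mydb.t2 SET v = :val_col WHERE k = :key_col;
INSERT INTO mydb.t2 (k, v) VALUES (:key_col, :val_col);

.IMPORT INFILE /path/source.txt
        FORMAT VARTEXT '|'
        LAYOUT src_layout
        APPLY ins_t1
        APPLY ups_t2;

.END MLOAD;
.LOGOFF;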

Hello Friends,

We are planning to purchase DataStage 8.0 and install it on z/Linux. I have a few questions, and I'd appreciate it if someone could provide some info:

1) Can we connect to Teradata V2R5 using DataStage 8.0 on z/Linux?
2) If we have to upgrade our Teradata version, what version works with DataStage 8.0 on z/Linux?

Thanks

I have a table with a BLOB column containing a '.ZIP' file. I am able to manually fire a select and save it to a file by providing the required file name. I just need a simple command to do it automatically, where I can hardcode the PATH and EXTENSION to which I can save it. It can be incorporated either in a BTEQ or a stored proc...

Hi,

I have installed the Teradata Demo version 12.0 on Windows Vista. When I try to access BTEQ or any other utility, I have a login problem.

Q1. Is there a default userid and password to log in to BTEQ? I tried:

Enter your logon or BTEQ command:
.login tduser
Password: tduser

The message I get is:

*** CLI error: MTDP: EM_NOHOST(224): name not in HOSTS file or names database.
*** Return code from CLI is: 224
*** Error: Logon failed!
*** Total elapsed time was 10 seconds.
Teradata BTEQ 08.02.00.00 for WIN32.
Enter your logon or BTEQ command:

I am new to BTEQ. Kindly help me in resolving this issue. Any help would be greatly appreciated.
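
For what it's worth, EM_NOHOST means CLI could not resolve the Teradata node name: it takes the tdpid from the logon, appends "cop1", and looks that name up in the hosts file. A hedged example for a local demo install; the alias demotdat is illustrative, so use whatever tdpid your demo documentation specifies:

# entry in C:\Windows\System32\drivers\etc\hosts (alias is hypothetical)
127.0.0.1   demotdatcop1

After that, a logon of the form .logon demotdat/tduser should at least get past name resolution; the actual userid and password depend on how the demo database was set up.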

Hi All,

I'm trying to use TPT to load data directly from an Oracle table to Teradata. BUT... I'm having a little problem with numeric data types. In Oracle, I have a lot of different NUMBER(10,0) and other columns. Let's take one for example, a column called DURATION. The actual data in this column fits in an INTEGER type easily, so in the SCHEMA in the TPT script file and in Teradata I identified it as INTEGER. Now when running the TPT script, I get this error:

Error: Row 1: data from source column too large for target column
Column: DURATION, Data: '3'

And it is like this for every numeric column. I even tried to put exactly the same data type everywhere - NUMBER(10,0) in Oracle, DECIMAL(10,0) in Teradata and the TPT SCHEMA - but no luck. Tried using DECIMAL(38,0) - no luck either, same problem. Is it an Oracle ODBC driver problem? (Oracle version is 10.) The only way I see is to CHAR everything and then in the later ETL stages cast it all back to whatever I need. Any help would be great!

Hi All,

I have a MultiLoad error in DataStage. The input is UTF8 and so is the charset of the MultiLoad script. I get 6706 and 6705 errors in Teradata. What can I do to suppress this type of problem? I don't have a staging table for this.
