
What will happen if we collect statistics on an empty table? Do we face any complexities because of this?

Hi, in our current project we are using Informatica and Teradata. All the operations done in Informatica can also be done with Teradata SQL. So can anybody tell me in which scenarios Informatica is better than Teradata?

I have installed Teradata Express 12.0 on my PC for R&D purposes. I have executed the FastExport utility; I wanted to export the data to Windows Notepad. The utility is working fine and fetches the correct number of records from the Teradata database, but the BYTE data written out is in an unreadable format. I ran the utility with the default character set specification, ASCII, and also executed it with the character set specification set to UTF8, but the data in Notepad is still unreadable. I am not getting the BYTE data from the Teradata database in a readable form in Notepad. I have come across the UDF udf_toChar_mk183200 to convert BYTE to VARCHAR but don't know how to use it in the FastExport job script. Can anyone tell me how to go about it? Regards, Torai

Hi, I need to generate a flat file using FastExport for Informatica to read: Teradata (export with fexp) to flat file, then a SQL Server load using Informatica. I need a fixed-width or comma-separated file. What should the format of my FastExport script be? MODE RECORD FORMAT ?
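
A minimal sketch of what such a FastExport script could look like. All names here (the logtable, logon string, src_db.src_table, col1/col2, the output path) are placeholder assumptions, not anything from the post; MODE RECORD FORMAT TEXT writes each row as one text record, so every selected column is trimmed/cast to character and the delimiter is concatenated in:

```shell
# Write a hedged FastExport script sketch to a file; the real run (fexp < script)
# needs a Teradata system, so it is only shown commented out.
cat > /tmp/flatfile_export.fexp <<'EOF'
.LOGTABLE workdb.fexp_log;
.LOGON tdpid/user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /tmp/extract.csv MODE RECORD FORMAT TEXT;
SELECT TRIM(col1) || ',' || TRIM(col2)
FROM src_db.src_table;
.END EXPORT;
.LOGOFF;
EOF
# fexp < /tmp/flatfile_export.fexp
cat /tmp/flatfile_export.fexp
```

For a fixed-width file instead of comma-separated, the usual approach is to cast each column to CHAR(n) and concatenate without delimiters.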

DELETE FROM Employee WHERE Term_date > (subquery)

Hi, I am getting the following error while connecting to Teradata from Java using the Type 4 driver. The interesting thing is that the same program works in the same environment as a standalone, but whenever it is run inside our application we get the error below. Our application is a batch-process application; it does not run on any application server. Please help me.

2009-03-13 18-27-53.636:UPDTESVC:SEVERE:winbackpapp1:lvoserver0:9070:com.convergys.lvo.server.framework.messaging.socket.SocketEventAdapter$1:run():232:"accept of socket failed, msg=Address already in use"
Exception in thread "main" java.lang.StackOverflowError
    at java.nio.MappedByteBuffer.<init>(MappedByteBuffer.java:69)
    at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:151)
    at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:169)
    at java.util.zip.ZipFile$MappedZipFileInputStream.<init>(ZipFile.java:632)
    at java.util.zip.ZipFile.getInputStream(ZipFile.java:307)
    at java.util.zip.ZipFile.getInputStream(ZipFile.java:287)
    at java.util.jar.JarFile.getInputStream(JarFile.java:393)
    at sun.misc.URLClassPath$JarLoader$1.getInputStream(URLClassPath.java:620)
    at sun.misc.Resource.cachedInputStream(Resource.java:58)
    at sun.misc.Resource.getByteBuffer(Resource.java:113)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:249)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    [the ClassLoader/URLClassLoader frames above repeat twice more]
    at org.bouncycastle.asn1.nist.NISTObjectIdentifiers.<clinit>(Unknown Source)
    at org.bouncycastle.jce.provider.symmetric.AESMappings.<clinit>(Unknown Source)
    at org.bouncycastle.jce.provider.BouncyCastleProvider.<init>(Unknown Source)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
    at java.lang.Class.newInstance0(Class.java:350)
    at java.lang.Class.newInstance(Class.java:303)
    at sun.security.jca.ProviderConfig$3.run(ProviderConfig.java:240)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.security.jca.ProviderConfig.doLoadProvider(ProviderConfig.java:225)
    at sun.security.jca.ProviderConfig.getProvider(ProviderConfig.java:205)
    at sun.security.jca.ProviderList.loadAll(ProviderList.java:254)
    at sun.security.jca.ProviderList.removeInvalid(ProviderList.java:271)
    at sun.security.jca.Providers.getFullProviderList(Providers.java:158)
    at java.security.Security.getProviders(Security.java:422)
    at sun.security.jgss.ProviderList.<init>(ProviderList.java:116)
    at sun.security.jgss.GSSManagerImpl.<init>(GSSManagerImpl.java:28)
    at org.ietf.jgss.GSSManager.getInstance(GSSManager.java:130)
    at com.teradata.tdgss.jgssp2gss.SspiMechanism.<init>(DashoA1*..)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
    at java.lang.Class.newInstance0(Class.java:350)
    at java.lang.Class.newInstance(Class.java:303)
    at com.teradata.tdgss.jtdgss.TdgssManager.createObject(DashoA1*..)
    at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
    at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
    at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:583)
    at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:601)
    at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:694)
    at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
    at com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
    at com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
    at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:194)
    at com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:94)
    at com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:55)
    at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:216)
    at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:149)
    at java.sql.DriverManager.getConnection(DriverManager.java:525)
    at java.sql.DriverManager.getConnection(DriverManager.java:171)
    at com.convergys.tks.util.TKSConnectionPool.addConnection(TKSConnectionPool.java:161)
    at com.convergys.tks.util.TKSConnectionPool.<init>(TKSConnectionPool.java:60)
    at com.convergys.tks.util.TKS_DAO.buildSources(TKS_DAO.java:99)
    at com.convergys.tks.util.TKS_DAO.init(TKS_DAO.java:44)
    at com.convergys.tks.plugins.bo.DailyBOCacheMiss.init(DailyBOCacheMiss.java:139)
    at com.convergys.lvo.server.manager.LVOManagerNode.initializeManagedPlugins(LVOManagerNode.java:607)
    at com.convergys.lvo.server.manager.LVOManagerNode.initialize(LVOManagerNode.java:579)
    at com.convergys.lvo.server.manager.LVOManagerNode.<init>(LVOManagerNode.java:303)
    at com.convergys.lvo.server.updatesvc.LVOUpdateSvc.<init>(LVOUpdateSvc.java:454)
    at com.convergys.lvo.server.updatesvc.LVOUpdateSvcAgent.<init>(LVOUpdateSvcAgent.java:68)
    at com.convergys.lvo.server.updatesvc.LVOUpdateSvcAgent.main(LVOUpdateSvcAgent.java:102)

FYI, I found a slippery SQL Assistant feature. I've just logged this with Teradata feedback, but I'm not sure that was the correct channel to log a software bug. Basically, if you use Teradata SQL Assistant (QueryMan), I'd recommend *not* enabling the option to use multiple queries in a single session (menu -> Tools -> Options -> allow multiple queries). This option is evil :) It works fine for opening multiple SQL queries, but the danger comes later, when saving them. The default name for *all* of the multiple queries is the most recently opened or saved name. If you open three SQL queries, when you come to save them, each one will be saved with the name of the most recent query. Therefore you are likely to overwrite queries and lose work. You've been warned!!! Cheers, Tim

I loaded the Teradata demo on my system. I want to open BTEQ, but it asks for a login/password. Can anyone tell me where I can find them? I didn't give any username or password while installing.

Hi, can someone please tell me how to skip the header row of a flat file when importing with a FastLoad script on Unix. Thanks, Nick
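
One approach is a hedged preprocessing sketch: strip the first line in the shell before FastLoad ever reads the file (the data file below is a made-up stand-in, and the load.fl path in the comment is hypothetical). FastLoad also has its own `RECORD 2;` command to start reading at the second record, which avoids the extra pass entirely:

```shell
# Stand-in input file with a header row followed by two data records.
printf 'col_a|col_b\n1|x\n2|y\n' > /tmp/input.dat

# Drop the first (header) line so FastLoad sees only data records.
tail -n +2 /tmp/input.dat > /tmp/input_nohdr.dat

# fastload < load.fl   # load.fl would point at /tmp/input_nohdr.dat
cat /tmp/input_nohdr.dat
```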

I have a Java program which connects to Teradata and backs up data into a local MS Access database. After installing TTU 12 it stopped working, saying:

java.sql.SQLException: [Microsoft][ODBC Driver Manager] Unknown

This was while accessing MS Access, not Teradata, and it has happened on all systems on which TTU 12 was installed. I thought maybe the problem lies in the ODBC setup. I was using a DSN-less connection so far, to avoid having to set up User DSNs etc. on various systems with limited registry access:

Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
String database = "jdbc:odbc:driver={Microsoft Access Driver(*.mdb)};DBQ=c:/db2.mdb;DriverID=22;READONLY=true}";
Connection con = DriverManager.getConnection(database, "", "");
Statement st = con.createStatement();
ResultSet rs = st.executeQuery("select * from login");

So then I created an ODBC source and connected through that, like this:

String url = "jdbc:odbc:wombat";
Connection con = DriverManager.getConnection(url, "oboy", "12Java");

But the same error still occurs. When I opened MS Access itself and tried to access the tables, I got a single error dialog box with the single word "Unknown", which is exactly what is passed to the driver and appears in my Java SQLException. Now, I know this appears to be a Java issue with nothing to do with Teradata, but it happened on 20 systems, in widely separated geographic locations, on exactly the day TTU 12 was installed. All had Windows XP SP2 (32-bit).

Furthermore, the MS Access database was in Access 2000 format, so I converted it to the 2002-2003 format and tried again, but got the same "Unknown" error when I clicked to open any table. Now MS Access 2003 can't open its own tables! I then tried this on a system with Office 2007, and both Access and JDBC work there, so TTU 12 has not affected systems with Office 2007.

My conclusion is that TTU 12 in some way broke the ODBC setup and affected MS Access settings, even though there is no way it should. Can someone please make sense of all this? I have Office 2003 on my system and we are not upgrading to Office 2007, but I really need to access the tables in this database. How come this was not caught during testing? And please don't tell me to download the Office compatibility pack; TTU 12 is not supposed to touch MS Office in any way! This has been posted before and no attempt was made to resolve it: http://www.teradata.com/teradataforum/Topic13050-10-1.aspx?Highlight=ms+access

Hi, I am getting the following error when doing a FastLoad: **** 15:21:00 RDBMS error 2635: Error tables are invalid OR TFL_SALE is in an invalid state for Load. Does this mean there is a FastLoad lock on the table? Need help in solving this. Thanks, Nick

I am being told by my Teradata team that since they do not use LDAP they cannot make use of the security features of the TDGSS library; is this true? Does TDGSS have any dependencies? I read somewhere that TDGSS works only for network-attached clients and not for channel-attached clients; what is the difference between the two? I know our team is using the TD2 mechanism

I am on a Windows XP system. When I run FastLoad like this:

fastload < loadcmd.fl

it works correctly. When I try to run it and redirect the output to a file:

fastload < loadcmd.fl > loadcmd.log
or
fastload < loadcmd.fl > loadcmd.log 2>&1

I get a null-reference error from Windows. The error is: The instruction at "0x00000000" referenced memory at "0x00000000". The memory could not be "read". Anyone seen this type of error before? I am running FastLoad version 12.00.00.004. R
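
As a hedged workaround sketch only (it does not explain the crash itself), piping the utility's output through `tee` instead of using direct `>` redirection is one thing worth trying, since it changes how stdout is attached to the process. Here a plain `echo` stands in for the fastload invocation, which obviously cannot run outside a Teradata environment:

```shell
# Stand-in for: fastload < loadcmd.fl > loadcmd.log
# tee captures the output to the log file via a pipe rather than
# a shell redirection of the utility's own stdout.
echo "simulated fastload output" | tee /tmp/loadcmd.log > /dev/null
cat /tmp/loadcmd.log
```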

Hi. Due to the size of my history database and the number of connections I have, I decided to switch to using a separate history database for each connection type. I have no issues with SQL Assistant creating the history databases for any of my connections, with the exception of Merrant Tracker SQL Server databases. For some reason, these history databases do not get created as an Access database in my common folders. However, the history is being tracked: each query I've run appears in the history pane. I'm simply trying to find where it's being stored. I've tried running a scan of all drives connected to my PC for any *.mdb files, but nothing appears that would be a history database for this specific connection. Does anyone else have this issue, and/or does anyone have any suggestions as to where this history might be stored? Thanks, John

We recently upgraded several Windows servers to V12 of the Teradata Utilities as part of a hardware upgrade. The odd thing is that we have been able to successfully run utility jobs (FastExport, FastLoad, MultiLoad) on these servers without actually setting up an ODBC data source that points to our new Teradata database machine. The same thing has occurred on Linux servers we have upgraded. We have built test DataStage apps that extract from or load to the new Teradata machine, but we have not added any new entries to the Linux hosts file; we just run the DS job and, somehow, it connects correctly. I hate to go looking for trouble when something is working, but we are genuinely puzzled as to why it works. How are the Teradata Utilities able to find and connect to this otherwise unknown machine? Any and all help is appreciated!

Hi, I'm using BTEQ to export data into a CSV file for opening in MS Excel. I need the selected data, along with the title of the data, to show up in the Excel file. However, the long title makes the first column wide, so opening the Excel file does not show the data in the first column (unless you double-click the edge of the column so that the entire width of the cell expands). Is there a way to format this such that I will still be able to see the first column upon opening the file? Here is my command:

LOGON_STRING=`cat logon/bteq`
bteq <<- EOF
.LOGON ${LOGON_STRING};
.SET TITLEDASHES OFF;
.SET SEPARATOR ',';
.EXPORT file=/mydir/test_file.csv
.SET RTITLE "THIS IS THE TITLE OF THIS CSV FILE WE ARE EXPORTING FROM THE DATABASE";
.SET FORMAT ON;
SELECT
  test_id (CHAR(8)) (TITLE 'TEST'),
  test_cd (TITLE 'CODE')
from database.table;
.SET FORMAT OFF;
.EXPORT RESET
.LOGOFF

The result is this (when viewed in Excel, the first column values under TEST are not visible, since the size of that column is small while the width of the title's column is large; also, even if I set the length of TEST to be only 8 chars, the column automatically adds extra characters before the values):

09/03/04 THIS IS THE TITLE OF THIS CSV FILE WE ARE EXPORTING FROM THE DATABASE Page 1
(this has blank spaces) TEST CODE
(this has blank spaces) TF999003 TCLSOH65DS1
(this has blank spaces) TF999002 TCMTOH3333E
(this has blank spaces) TF999006 TCHGMIMNDS1
(this has blank spaces) TF999005 TCWOTXPEAXD
(this has blank spaces) TF999004 TCRTMIMTDS0

In the Unix .csv file, it shows up as:

09/03/04 THIS IS THE TITLE OF THIS CSV FILE WE ARE EXPORTING FROM THE DATABASE Page 1
(this has blank spaces) TEST ,CODE
(this has blank spaces) TF999003,TCLSOH65DS1
(this has blank spaces) TF999002,TCMTOH3333E
(this has blank spaces) TF999006,TCHGMIMNDS1
(this has blank spaces) TF999005,TCWOTXPEAXD
(this has blank spaces) TF999004,TCRTMIMTDS0

Any idea how I can remove the excess spaces before the first column's values? Thanks!
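
If the padding cannot be eliminated inside BTEQ itself, a post-processing sketch with sed can strip the blanks around the separators after the export. The sample file below is a hand-made stand-in for the exported CSV, not the poster's real data, and the paths are placeholders:

```shell
# Stand-in for the exported file, with padded columns like those described above.
printf 'TEST    ,CODE\nTF999003,TCLSOH65DS1\n  TF999002,TCMTOH3333E\n' > /tmp/test_file.csv

# Remove blanks before each comma, plus leading/trailing blanks on every line.
sed -e 's/ *,/,/g' -e 's/^ *//' -e 's/ *$//' /tmp/test_file.csv > /tmp/test_file_clean.csv
cat /tmp/test_file_clean.csv
```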

Hi all, can anybody tell me what the purpose of .EXPORT RESET in BTEQ is? I know the bookish answer, but I am not satisfied with that one.

Hi, I have some doubts about integer formats. I created 3 tables and ran a similar set of queries:

1)
create table table1 (a integer format '-(10)9');
insert into table1 values (123456);
sel a (char(6)) from table1;
output: 1

2)
create table table2 (a integer format '-9(10)');
insert into table2 values (123456);
sel a (char(6)) from table2;
output: 00001

3)
create table table3 (a integer format '(10)9');
insert into table3 values (123456);
sel a (char(6)) from table3;
output: 000012

I am not able to understand why different outputs are coming, and more precisely how signed/unsigned is handled here. Also, what is the difference between '-9(10)' and '-(10)9'?

Hello, I'd like to know how to expand the partitioning of an Aggregate Join Index. Thanks in advance

The error info: 429: The monitor object (TDMon5.dll) is missing, or is not correctly registered.

Hi, MultiLoad automatically creates error tables while loading from a file to the database. Is there any way to generate error files on the Unix box instead of error tables? Please help me in this case. Thank you. Regards, Kiran

Hi, I am getting the below error while inserting data into a table:

** Error: Import data size does not agree with byte length. The cause may be: 1) IMPORT DATA vs. IMPORT REPORT 2) incorrect incoming data 3) import file has reached end-of-file.

I am importing data from the below file:

cat /home/user/jj
xyz 1
abc 2

BTEQ script as below:

.logon xyz/loginid,password
.IMPORT DATA FILE = /home/user/jj, SKIP = 2
.QUIET ON
.REPEAT *
USING NAME (VARCHAR(25)), ID (INTEGER)
INSERT INTO DB.TABLE (NAME, ID)
VALUES (:NAME, :ID);
COMMIT;
.LOGOFF
.QUIT

My table definition:

CREATE SET TABLE DB.TABLE, NO FALLBACK,
  NO BEFORE JOURNAL, NO AFTER JOURNAL, CHECKSUM = DEFAULT
(
  NAME VARCHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC TITLE 'MANAGER_NAME' NOT NULL,
  ID INTEGER
)
PRIMARY INDEX (ID);

Please help me out of this. Thanks a lot. Regards, Ravi.

Hi, can anybody tell me what we need to do when we are exporting 1) integer 2) character 3) date to an IBM environment (as the client)? I am getting junk characters. Can anyone tell what kind of rules we need to follow when the client is IBM versus Windows while FastExporting the file?

Hi, we have defined a workload with classifications of < 1 sec CPU time and 'Include Single or Few Amp Queries Only'. Does anybody know if there is a definition of 'Few Amps' that is used?

Hi, I have a file with a header, data records and a trailer. The data records are delimited with commas, and the header and trailer are fixed width. The header and trailer start with HDR and TRA respectively. I need to skip the header and trailer while loading the file with MultiLoad. Please help me in this case. Thank you. Regards, Kiran
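
One option is to drop the header and trailer records in the shell before MultiLoad reads the file. This is a sketch with a made-up stand-in feed file; the only assumption taken from the post is that the records to discard start with HDR and TRA:

```shell
# Stand-in feed file: fixed-width header/trailer around comma-delimited data.
printf 'HDR20090313FEED\na,1\nb,2\nTRA0000000002\n' > /tmp/feed.dat

# Delete any record starting with HDR or TRA, keeping only the data records.
sed '/^HDR/d; /^TRA/d' /tmp/feed.dat > /tmp/feed_body.dat
cat /tmp/feed_body.dat
```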

Hi All, we have a requirement wherein a customer who has made purchases of at least $10000 in a 6-month period and made at least a single visit a month in a 6-consecutive-month period should be given a rating of A+B+, and a customer who has made purchases of at least $10000 in a 6-month period but has not made at least a single visit a month in a 6-consecutive-month period should be given a rating of A+B-.

Sample data:

CUSTOMER_ID  PURCHASE_DATE  PURCHASE_AMOUNT
101          02-MAR-07      10000
101          02-APR-07      8000
101          05-MAY-07      10000
105          18-JAN-08      10000
101          18-FEB-08      5000
102          18-MAR-08      20000
103          18-APR-08      3000
102          23-MAY-08      2000
104          02-JUN-08      10000
103          02-JUN-08      30000
101          02-JAN-08      51000

With this kind of data we need to write a query that retrieves only those customers who have data in each of the last six months; in this example it should be customer '101'. Customer '103' should be ignored, as it does not have data for the last six months CONSECUTIVELY. Thanks & Regards, Praveen

Hi All, I need to estimate the MultiLoad, FastLoad, and BTEQ scripts in a project. Can anybody tell me which things to consider in the complexity of each MultiLoad, FastLoad, and BTEQ job, and the time for each activity? Thanks in advance

I am calling the following:

bteq
.run file=/usr/wh/dev/login/login_td_procs.txt
(output logged to /usr/wh/dev/log/truncate_y00442p_ot_bcp_as400_extract_status.log)

where /usr/wh/dev/login/login_td_procs.txt contains:

.logon mpp/dev_data_v, ;
database dev_data_v;

and /usr/wh/dev/sql/truncate_y00442p_ot_bcp_as400_extract_status.sql contains:

delete from ot_bcp_as400_extract_status;
drop table ot_bcp_as400_extract_status_e1;
drop table ot_bcp_as400_extract_status_e2;

The following is the resulting log file on SuSE SLES10:

+---------+---------+---------+---------+---------+---------+---------+----
.run file=/usr/wh/dev/login/login_td_procs.txt
+---------+---------+---------+---------+---------+---------+---------+----
del.logon mpp/dev_data_v, ;
+---------+---------+---------+---------+---------+---------+---------+----
database dev_data_v;
+---------+---------+---------+---------+---------+---------+---------+----
+---------+---------+---------+---------+---------+---------+---------+----
ete from ot_bcp_as400_extract_status;
+---------+---------+---------+---------+---------+---------+---------+----
drop table ot_bcp_as400_extract_status_e1;
+---------+---------+---------+---------+---------+---------+---------+----
drop table ot_bcp_as400_extract_status_e2;
+---------+---------+---------+---------+---------+---------+---------+----
*** BTEQ exiting due to EOF on stdin.
*** Exiting BTEQ...
*** RC (return code) = 2

However, when run from RHEL v3 AS:

+---------+---------+---------+---------+---------+---------+---------+----
.run file=/usr/wh/dev/login/login_td_procs.txt
+---------+---------+---------+---------+---------+---------+---------+----
.logon mpp/dev_data_v,
*** Logon successfully completed.
*** Transaction Semantics are BTET.
*** Character Set Name is 'ASCII'.
*** Total elapsed time was 1 second.
+---------+---------+---------+---------+---------+---------+---------+----
database dev_data_v;
*** New default database accepted.
*** Total elapsed time was 1 second.
+---------+---------+---------+---------+---------+---------+---------+----
+---------+---------+---------+---------+---------+---------+---------+----
delete from ot_bcp_as400_extract_status;
*** Delete completed. One row removed.
*** Total elapsed time was 1 second.

Are we able to write custom functions in fastexport?

What is the key factor that makes TPT better than the other load utilities, given that it uses SQL-like language syntax but invokes the various load utility protocols? How is it different from, and similar to, the other load utilities? Does it really impact performance, and how?

Hi!! I am writing to see if you are currently available for a position in Ashburn, VA and Atlanta, GA. I am looking for a Teradata Production DBA for a 12+ month contract. If you or anyone you may know of is interested, please email me at your earliest convenience.

Location: Ashburn, VA and Atlanta, GA
Duration: 12+ months

Job Description: Sr. Teradata Production DBA with Teradata data warehouse experience, who will be responsible for design, installation, configuration, backup, recovery, performance tuning, replication, and active data warehousing. He should be strong in database design, tuning, and troubleshooting. Ability to write complex stored procedures. Understands SOX requirements pertaining to data warehousing. Familiarity with SQL. Experience with NetBackup and ARC utilities; BTEQ and Unix scripting languages. Ability to create and understand data models.

Skills: AIX, Applications, Computer, Database, Data Warehouse, DBA, Engineering, Linux, Management, Performance, Solaris, SQL, System, Systems, Teradata, Unix

Sincerely,
Raad
Global Resource Management, Inc. (GRMI)
678 935 0437 (Direct)
770 729 9222 (Fax)
raad@grmi.net
www.grmi.net
"GRMI - Not Just A Name, A Commitment to Excellence"

DiversityBusiness.com Awards 2007 (award, rank):
Top Women Owned Businesses in America (Div500Women-2007): 79
Top Subcontinent Asian American Owned Businesses in America (Div100SAA-2007): 22
Top Asian American Owned Businesses in America (Div100ASA-2007): 48
Top Diversity Owned Businesses in America (Div500-2007): 154
Top Small Businesses in America (Div500SmallBiz-2007): 116
Top Small Business in Georgia (Div100SmallBiz-2007): 8
Top Diversity Owned Businesses in Georgia (Div100State-2007): 10
Top Woman Owned Business in Georgia (Div100WomenState-2007): 7

Hi, I am running FastExport with various combinations and getting different results which I am unable to understand.

1)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (coalesce(Wrls,'') (char(10))) || (coalesce(Wrls,'') (char(15))) FROM BSAMPLE 1;
Output: SA-ABCDGroup Mechanics

2)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (coalesce(Wrls,'') (char(10))) || ',' || (coalesce(Wrls,'') (char(15))) FROM BSAMPLE 1;
Output: --SA-ABCD,Group Mechanics (junk characters at the beginning)

3)
.EXPORT OUTFILE 'C:\Documents and Settings\Desktop\test5.txt' format text mode record;
SELECT (COALESCE(TRIM(Wrls), ' ')) || ',' || (coalesce(TRIM(Wrls), '')) FROM BSAMPLE 1;
Output: SA-ABCD,Group Mechanics

Basically it seems like (COALESCE(TRIM(Wrls_Actvty_Cd), ' ')) = (coalesce(Wrls_Actvty_Cd,'') (char(10))), but how? I am not able to understand. Please also explain: if I have five columns in my table, 1) integer 2) date 3) character 4) varchar 5) numeric, what data conversions do we need to do at the time of exporting the file?

Hi, I have two FastExport scripts to write. The first one queries a table and returns multiple records, saved in an output file. The second one summarizes the data; one of the fields inside the second FastExport script is the summary (count) of the number of records obtained from the first export. To do this, I am thinking of counting the number of lines in the output file of FastExport #1 (wc -l) and assigning it to a variable (this is in ksh). I then want to be able to use the value of that variable at runtime in FastExport #2. Is there a way to get the value of that variable, or is there an alternative way for me to count the number of records (1 record = 1 line) from the output file of FastExport #1 and use that in FastExport #2?
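
One way in ksh is to treat the second FastExport script as a template and generate it with the count substituted in via an unquoted heredoc. This is a sketch: the file contents, paths, and the one-line SELECT inside the generated script are all placeholders, not the poster's real scripts:

```shell
# Stand-in for the output file produced by FastExport #1.
printf 'rec1\nrec2\nrec3\n' > /tmp/export1.out

# 1 record = 1 line; grep -c '' gives a clean count with no padding.
REC_COUNT=$(grep -c '' /tmp/export1.out)

# Build FastExport #2 from a template: the unquoted EOF lets $REC_COUNT expand.
cat > /tmp/export2.fexp <<EOF
SELECT 'RECORD_COUNT,' || '$REC_COUNT' FROM sys_calendar.calendar SAMPLE 1;
EOF
cat /tmp/export2.fexp
```

The generated script then contains the literal count, so FastExport #2 needs no runtime variable support of its own.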

Hi all, I am a beginner to Teradata here; I wonder if this question has been asked before. Is there any best-known format to use with TPump and MultiLoad that will give some edge to the performance of both of these TTUs? Thanks

Hi All, I know we can get a MultiLoad script auto-generated using FastExport. Are there any similar shortcuts to generate a TPump or FastLoad script, instead of coding the entire thing?

Hi, my requirement is: I get some files every day on a Unix server and I need to load them into Teradata tables using MultiLoad. I need to develop a shell script which reads the files from a particular directory and passes them to MultiLoad. Please can anyone help with how to pass the shell variable to the MultiLoad script?
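
A common pattern is to loop over the directory and generate a per-file MultiLoad script from a template, letting the shell expand the filename into it. This is a sketch under assumptions: the directory, file names, logtable, and LAYOUT name are all hypothetical, and the actual mload run is commented out since it needs a Teradata system:

```shell
# Stand-in incoming directory and data file.
mkdir -p /tmp/incoming
printf 'x,1\n' > /tmp/incoming/feed_20090313.dat

for DATAFILE in /tmp/incoming/*.dat; do
  # Generate a per-file MultiLoad script fragment; the unquoted EOF
  # lets $DATAFILE expand into the .IMPORT statement.
  cat > /tmp/job.mload <<EOF
.LOGTABLE workdb.ml_log;
.IMPORT INFILE $DATAFILE FORMAT VARTEXT ',' LAYOUT file_layout;
EOF
  # mload < /tmp/job.mload   # real run requires a Teradata system
done
grep 'INFILE' /tmp/job.mload
```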

Hi All, please can anyone let me know the points to remember while FastExporting data from a mainframe system and loading the data back onto a non-mainframe system using FastLoad/MultiLoad? I have FastExported a file using the FASTLOAD format and MLSCRIPT option from the mainframe. Now I am trying to load the same file onto a Linux system but am getting format incompatibility

Hi All, below is my sample control file. I would like to add a tracing level or log level and send that information to the log file. How do I achieve that, and where do I add it in the control file below? Please help me ASAP.

Control file content:

dateform ANSIDATE;
errlimit 1000000;
tenacity 4;
sessions 1;
sleep 6;
LOGON tdp1/rmw_rep,Rfg;
DROP TABLE SA_RMW_T_D.ET_WC_ACTIVITY_DS;
DROP TABLE SA_RMW_T_D.UV_WC_ACTIVITY_DS;
DELETE FROM SA_RMW_T_D.WC_ACTIVITY_DS ALL;
set record unformatted;
define
  INTEGRATION_ID (CHAR(30), NULLIF = '*'),
  DATASOURCE_NUM_ID (CHAR(12), NULLIF = '*'),
  X_LN_ACTV_CANCEL_REASON (CHAR(30), NULLIF = '*'),
  X_LN_BACK_OFFC_CREATE_DTS (CHAR(19), NULLIF = '*'),
  X_LN_CONT_LGL_EDU_CR_NBR (CHAR(20), NULLIF = '*'),
  X_LN_PLTFM (CHAR(30), NULLIF = '*'),
  X_LN_TRAIN_CRSE_NAME (CHAR(50), NULLIF = '*'),
  X_LN_DELIVERY_MTHD (CHAR(100), NULLIF = '*'),
  X_LN_MEETING_TYPE (CHAR(100), NULLIF = '*'),
  X_LN_NO_ATTENDEES (CHAR(12), NULLIF = '*'),
  X_LN_NO_EXEC_ATTENDEES (CHAR(12), NULLIF = '*'),
  X_LN_NO_NEW_ATTENDEES (CHAR(12), NULLIF = '*'),
  X_LN_VALUE_STMT (CHAR(255), NULLIF = '*'),
  X_LN_VALUE_STMT_TYPE (CHAR(50), NULLIF = '*'),
  X_LN_SALES_REP_LAST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_SALES_REP_FST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_SALES_MGR_LAST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_SALES_MGR_FST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_VP_LAST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_VP_FST_NAME (CHAR(50), NULLIF = '*'),
  X_LN_PREV_ACT_ID (CHAR(15), NULLIF = '*'),
  X_LN_CONSULT_HOURS (CHAR(22), NULLIF = '*'),
  X_LN_ACT_PREFERENCE (CHAR(30), NULLIF = '*'),
  X_LN_CREATED_DT (CHAR(19), NULLIF = '*'),
  X_LN_CREATED_DT_WID (CHAR(12), NULLIF = '*'),
  X_LN_APPT_DURATION_MIN (CHAR(24), NULLIF = '*'),
  X_LN_CREATOR_LOGIN (CHAR(50), NULLIF = '*'),
  X_LN_CONTACT_TRGT_REASON (CHAR(30), NULLIF = '*'),
  newlinechar (CHAR(:CF.PadLength))
file=:CF.ImportFileName;
show;
begin loading SA_RMW_T_D.WC_ACTIVITY_DS
  errorfiles SA_RMW_T_D.ET_WC_ACTIVITY_DS, SA_RMW_T_D.UV_WC_ACTIVITY_DS
  checkpoint 0;
insert into SA_RMW_T_D.WC_ACTIVITY_DS (
  INTEGRATION_ID, DATASOURCE_NUM_ID, X_LN_ACTV_CANCEL_REASON, X_LN_BACK_OFFC_CREATE_DTS,
  X_LN_CONT_LGL_EDU_CR_NBR, X_LN_PLTFM, X_LN_TRAIN_CRSE_NAME, X_LN_DELIVERY_MTHD,
  X_LN_MEETING_TYPE, X_LN_NO_ATTENDEES, X_LN_NO_EXEC_ATTENDEES, X_LN_NO_NEW_ATTENDEES,
  X_LN_VALUE_STMT, X_LN_VALUE_STMT_TYPE, X_LN_SALES_REP_LAST_NAME, X_LN_SALES_REP_FST_NAME,
  X_LN_SALES_MGR_LAST_NAME, X_LN_SALES_MGR_FST_NAME, X_LN_VP_LAST_NAME, X_LN_VP_FST_NAME,
  X_LN_PREV_ACT_ID, X_LN_CONSULT_HOURS, X_LN_ACT_PREFERENCE, X_LN_CREATED_DT,
  X_LN_CREATED_DT_WID, X_LN_APPT_DURATION_MIN, X_LN_CREATOR_LOGIN, X_LN_CONTACT_TRGT_REASON
) VALUES (
  :INTEGRATION_ID, :DATASOURCE_NUM_ID, :X_LN_ACTV_CANCEL_REASON, :X_LN_BACK_OFFC_CREATE_DTS,
  :X_LN_CONT_LGL_EDU_CR_NBR, :X_LN_PLTFM, :X_LN_TRAIN_CRSE_NAME, :X_LN_DELIVERY_MTHD,
  :X_LN_MEETING_TYPE, :X_LN_NO_ATTENDEES, :X_LN_NO_EXEC_ATTENDEES, :X_LN_NO_NEW_ATTENDEES,
  :X_LN_VALUE_STMT, :X_LN_VALUE_STMT_TYPE, :X_LN_SALES_REP_LAST_NAME, :X_LN_SALES_REP_FST_NAME,
  :X_LN_SALES_MGR_LAST_NAME, :X_LN_SALES_MGR_FST_NAME, :X_LN_VP_LAST_NAME, :X_LN_VP_FST_NAME,
  :X_LN_PREV_ACT_ID, :X_LN_CONSULT_HOURS, :X_LN_ACT_PREFERENCE, :X_LN_CREATED_DT,
  :X_LN_CREATED_DT_WID, :X_LN_APPT_DURATION_MIN, :X_LN_CREATOR_LOGIN, :X_LN_CONTACT_TRGT_REASON
);
end loading;
logoff;

Hi all, in a BTEQ script I need to pick a value from a query, assign it to a variable, and then use that variable in an IF/ELSE condition. Can anybody suggest how to assign a query value to a variable and use it in an IF condition? Thanks in advance. Kapil
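BTEQ has no general-purpose variable assignment, but the usual workaround is to run a query whose row count encodes the condition and then branch on the built-in ACTIVITYCOUNT (or ERRORCODE) value with .IF / .GOTO. A hedged sketch, written as a shell heredoc so the generated script can be inspected; the tdpid, table, and column names are made up for illustration:

```shell
#!/bin/sh
# Generate a BTEQ script that branches on ACTIVITYCOUNT: the SELECT
# returns rows only when the condition holds, and .IF tests the count.
cat > check_flag.btq <<'EOF'
.LOGON tdpid/user,password
SELECT 1 FROM mydb.control_tab WHERE load_flag = 'Y';
.IF ACTIVITYCOUNT > 0 THEN .GOTO RUNLOAD
.QUIT 0
.LABEL RUNLOAD
UPDATE mydb.control_tab SET load_flag = 'N' WHERE load_flag = 'Y';
.LOGOFF
EOF
# The script would then be run with:  bteq < check_flag.btq
cat check_flag.btq
```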

Hi, I am trying to run MultiLoad by exporting a date to a file and then using it. This is the code I am running:

.EXPORT REPORT DDNAME DELDATE
sel end_date (title '') from tableA;
.EXPORT RESET

In the MultiLoad script I use it as:

.ACCEPT DELDTE FROM FILE DELDATE;
DELETE FROM TABLE B WHERE CURRENT_DATE < (&DELDTE - 10);

but the substitution turns it into

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-10);

and it fails with an error. It should be

DELETE FROM TABLE B WHERE CURRENT_DATE < (2009-01-23 - 10);

Please clarify how to export a date in the correct format and use it in MultiLoad with the ACCEPT option.
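One common fix is to export the date already wrapped in single quotes, so the MultiLoad substitution yields a valid date literal that date arithmetic can be applied to. A sketch with assumed names, written as heredocs so the generated steps can be inspected:

```shell
#!/bin/sh
# Generate the BTEQ export step: four consecutive quotes emit a single
# quote, so the DELDATE file ends up containing e.g. '2009-01-23'
# rather than the bare 2009-01-23.
cat > export_date.btq <<'EOF'
.EXPORT REPORT DDNAME DELDATE
SELECT '''' || CAST(end_date AS CHAR(10)) || '''' (TITLE '') FROM tableA;
.EXPORT RESET
EOF
# In the MultiLoad step the accepted value can then be used as a DATE
# literal, and the "- 10" is evaluated as days on a typed DATE value.
cat > delete_step.mld <<'EOF'
.ACCEPT DELDTE FROM FILE DELDATE;
DELETE FROM TableB WHERE CURRENT_DATE < (DATE &DELDTE - 10);
EOF
cat delete_step.mld
```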

I need to export data from Teradata and import it into MySQL [cluster]. I was wondering if anyone has an example of a FastExport script which creates MySQL-import-friendly files. Thanks. Vadim
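A FastExport script can produce a delimited text file suitable for MySQL's LOAD DATA INFILE by casting every column to VARCHAR and concatenating a delimiter into a single string per row in the SELECT itself. A sketch under assumed table and column names (the real script would be fed to the fexp utility; depending on the release, leading record-length bytes may still need stripping, as discussed elsewhere in this thread):

```shell
#!/bin/sh
# Write a FastExport script whose SELECT builds one pipe-delimited
# VARCHAR per row, exported in RECORD mode as text.
cat > export_mysql.fx <<'EOF'
.LOGTABLE mydb.fx_restart;
.LOGON tdpid/user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /tmp/for_mysql.txt MODE RECORD FORMAT TEXT;
SELECT TRIM(CAST(cust_id AS VARCHAR(12))) || '|' ||
       TRIM(cust_name) || '|' ||
       TRIM(CAST(created_dt AS VARCHAR(10)))
FROM mydb.customers;
.END EXPORT;
.LOGOFF;
EOF
cat export_mysql.fx
```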

Hi, I have to repeatedly export some 30M rows in a "text" format, with variable length if possible. FastExport (V2R6) adds a two-byte indicator (as specified in the FastExport documentation) at the beginning of the records, whatever the mode, Indicator or Record. Does anybody know a way not to keep those two bytes? I tried to cast the whole chain - a concatenation o
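If the export is written as newline-terminated text records, one post-processing workaround is to strip the two leading bytes of every record after the fact. A runnable sketch with a fabricated file standing in for the real export; note this only works when the two length bytes cannot themselves contain a newline byte, which limits it to short, text-safe records:

```shell
#!/bin/sh
# Fabricate a 3-record export in which every newline-terminated record
# starts with two junk bytes standing in for the length indicator.
printf 'XXrow one\nXXrow two\nXXrow three\n' > export.dat
# cut -b 3- drops the first two bytes of every record.
cut -b 3- export.dat > clean.dat
cat clean.dat
```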

Hi All, I need to copy a database into a newly created database. My old database has around 200 objects, comprising macros, stored procedures, and tables/views. Is there any Teradata utility available to do this? Copying each element individually would take a long time. It's an urgent requirement. Thanks in advance.
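Teradata's Archive/Restore utility (ARC) can copy a whole database in one job. Short of that, SQL can generate per-table copy statements from the data dictionary. A sketch with made-up database names, written to a file for inspection; views, macros, and stored procedures would still need SHOW/REPLACE handling or an ARC job:

```shell
#!/bin/sh
# Generate SQL that, when run in BTEQ, emits one
# "CREATE TABLE ... AS ... WITH DATA;" statement per source table.
cat > gen_copy.sql <<'EOF'
SELECT 'CREATE TABLE new_db.' || TRIM(TableName) ||
       ' AS old_db.' || TRIM(TableName) ||
       ' WITH DATA;' (TITLE '')
FROM DBC.Tables
WHERE DatabaseName = 'old_db' AND TableKind = 'T';
EOF
cat gen_copy.sql
```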

When I run a query using TdCommand.ExecuteReader I am getting back an Integer when the type should be TdTime. The data type of the field in question is TIME within the Teradata DB. I have also set "Enable TdDateTime = true;" in the connection string. Please advise.

Hi, I have installed the Teradata demo and want to connect to the server, but I get this error: "ODBC 28000: not enough information for log on". Please help me resolve it. Neeraj Shukla
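The 28000 "not enough information for log on" error usually just means the driver was not given a username and password (or a DBCName). They can be entered in the DSN setup dialog or supplied in the connection string; a hypothetical example for a local demo system, assuming the default dbc account:

```
DRIVER={Teradata};DBCName=127.0.0.1;UID=dbc;PWD=dbc;
```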

Hello, I ran FastExport as follows:

.logtable retail.restartlog1_fxp;
.logon localtd/dbc,dbc;
.begin export sessions 4;
.export outfile C:\Users\test\Documents\retail14.txt;
sel * from retail.area_3;
.end export;
.logoff;

and the output it gave me is p

Hi! I tried installing Teradata Express Edition 12.0 on a Windows Server 2008 R2 server, but it did not work. The installer did not recognize the operating system and claims: "Your Microsoft Windows version is not listed above.

We need to use BTEQ on one of our Unix boxes. The DNS configuration of the box doesn't use the /etc/hosts file, just a name server, and I have a problem connecting to TD. From the box I can ping the TD server, ssh to it, telnet to the TD port, etc., all just fine. But if I try to connect using BTEQ I get: *** CLI error: MTDP: EM_NOHOST(224): name not in HOSTS file or names database. *** Return code from CLI is: 224 *** Error: Logon failed! What are we missing? What is the "names database" and how can I configure it? Thanks, Vadim
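CLI does not resolve the tdpid itself: it appends "cop1", "cop2", … to the name you log on with and looks that name up in /etc/hosts or DNS (the "names database"). So for `.logon tdprod/...` the name server must be able to resolve `tdprodcop1`, which is why ping and telnet to the bare host name succeed while BTEQ fails. Hypothetical entries, whether defined in DNS or in a hosts file:

```
# CLI resolves <tdpid>cop<n>; one entry per Teradata node
10.0.0.21   tdprodcop1
10.0.0.22   tdprodcop2
```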

I had a bright idea and wrote a Unix shell script that takes a Teradata logon and password as parameters. I thought I could then substitute that logon and password into my FastLoad script without saving them to a file anywhere. However my attempt with sed failed, because fastload treats the sed statement as the file name. Has anyone managed to overcome this? This is what I did. First I tried the statement below on its own, which works, printing the FastLoad script with the logon and password in the right place (logonid was the dummy word I had put after .logon and the IP address, before the ;):

sed 's/logonid/logon,password/g' <FLOAD_DIM_BRANCH.txt

Then I tried it with the fastload command:

cagcrd:BADEV:/badata/wtdata/scripts >fastload < sed 's/logonid/logon,password/g' <FLOAD_DIM_BRANCH.txt
A file or directory in the path name does not exist.
ksh: sed: 0403-016 Cannot find or open the file.

Reading up on Unix it looks like plain redirection does not let you do this, but I have hopes that someone out there knows a way around it. Regards, Susan
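Instead of redirecting twice, the sed output can be piped straight into fastload, so the password never touches disk. A runnable sketch with `cat` standing in for fastload and made-up credentials:

```shell
#!/bin/sh
# Build a dummy fastload script containing the placeholder word that
# the wrapper will substitute ("logonid" is the stand-in from the post).
printf '.logon 10.0.0.1/logonid;\nSHOW VERSIONS;\n' > FLOAD_DIM_BRANCH.txt
# Pipe the substituted script into the utility instead of redirecting
# twice; with a real client this would be:
#   sed "s/logonid/$LOGON,$PASS/g" FLOAD_DIM_BRANCH.txt | fastload
# "cat" stands in for fastload so the sketch runs anywhere.
sed 's/logonid/myuser,mypass/g' FLOAD_DIM_BRANCH.txt | cat > subst.out
cat subst.out
```

In ksh93 or bash the same idea can be written with process substitution, `fastload < <(sed ...)`, but the plain pipe is the most portable form.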