High Level Design (HLD) is the overall system design, covering the system architecture and database design. It describes the relationships between the various modules and functions of the system; data flows, flow charts, and data structures are covered under HLD. Low Level Design (LLD) details the HLD.
2) Design Phase. a) High level design document (HLD): an ETL architect and a DWH architect participate in designing a solution to build the data warehouse; the HLD document is prepared based on the business requirements. b) Low level design document (LLD): based on the HLD, a senior ETL developer prepares the low level design document.
Those are SAP ABAP programs used to extract metadata from SAP systems. SAP is not like other relational databases, and metadata extraction from it is not straightforward, so Informatica built those ABAP programs for metadata extraction purposes. Do you have any kind of high level architecture or a deeper explanation?
Say, for example, we have one Domain Node (node01) and one Worker Node (node02), and we want to achieve high availability for Informatica. If the Domain Node goes down, we want the Worker Node to act as the gateway. We tried to achieve this by selecting the other node as a gateway node. However, when we switch off the Domain Node (node01), the domain is ...
Axon - Datasets and system interfaces. Members of our team want to create a dataset displaying the data being moved between the 'hops': for example, a dataset from our DW to our DL landing, and then a dataset from our DL landing into discovery. I feel this would be incredibly confusing, but the rationale is that maybe not all the data in ...
Hi Team, could you please provide some high level steps? We want to upgrade Informatica PowerCenter from 9.6.1 to 10.1 using the parallel method. Our Informatica services run on a Linux platform with a 2-node architecture and an Oracle DB. Thanks, Kittu
Enterprise Data Catalog. Actually, it depends on the source type that you will be scanning. For relational sources like Oracle, we expect the user account you specify to have read access to the system tables (also known as data dictionary views) that contain the metadata. We then query the metadata from those system tables and ingest it into the catalog.
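The scanning pattern described above can be sketched as follows. This is only an illustration of the idea, not EDC's actual scanner code: in Oracle the metadata would come from data dictionary views such as ALL_TABLES and ALL_TAB_COLUMNS, but SQLite is used here as a stand-in so the sketch is self-contained and runnable (its "system catalog" is sqlite_master and PRAGMA table_info).

```python
import sqlite3

# Stand-in source database with one user table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# A metadata scanner queries the system catalog, not the data itself --
# read access to these catalog views is all the scanning account needs.
tables = [row[0] for row in
          conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
columns = [(row[1], row[2]) for row in
           conn.execute("PRAGMA table_info(customers)")]

print(tables)   # ['customers']
print(columns)  # [('id', 'INTEGER'), ('name', 'TEXT')]
```

The extracted table and column metadata would then be ingested into the catalog; only the catalog queries touch the source, never the row data.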
POWER (2, 57), with a flat-file target for testing purposes. I also enabled "high precision" in the session properties. Suppose that the Informatica Integration Service can support a precision of 20 digits. After I executed the session I got the value 144115188075855900, but the correct value is 144115188075855872 (18 digits).
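The discrepancy above is consistent with the value passing through a double-precision float and then being rendered with only about 16 significant decimal digits. A small Python sketch of that hypothesis (not Informatica's actual internals):

```python
from decimal import Decimal

exact = 2 ** 57                 # integer arithmetic: exact
# exact == 144115188075855872

as_double = float(exact)        # 2**57 is a power of two, so the double is exact
assert as_double == exact

# Rendering the double with only 16 significant digits rounds the tail away,
# which reproduces the value seen in the flat-file target.
rendered = f"{as_double:.16g}"  # '1.441151880758559e+17'
expanded = int(Decimal(rendered))
print(expanded)                 # 144115188075855900
```

So the arithmetic itself can be exact while the formatted output is not; the trailing "...900" is a rounding artifact of the decimal rendering, not of POWER(2, 57).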
Hello All, I am responding to a request wherein we are asked to migrate Ab Initio code to Informatica PowerCenter. If you have done similar work in the past, any thoughts or guidance will be very much appreciated. Thoughts on migrating any other ETL tool to INFA are also fine. At a very high level, I am planning to convert the Ab Initio XML file to look ...