
03 February 2021

Data Migrations (DM): Conceptualization IV (Data Access)

Data Migration
Data Migrations Series

Once the data sources for a Data Migration (DM) are identified, the first question is how the data can be accessed. Legacy systems relying on ODBC-based databases are in theory relatively easy to access, as long as they allow direct access to their data, which enables a pull strategy. Despite this, some organizations don't allow direct access to the data even for read-only operations, preferring instead to push the data directly to the consumers (aka push strategy) or to push the data to a given location from which the consumers can use the data as needed (aka hybrid strategy).
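As a minimal sketch of how such a pull could look, assuming Python with the pyodbc package and a read-only ODBC DSN (the DSN, credentials and table names below are placeholders, not taken from any specific system):

import csv
import pyodbc

# Pull a master data table over ODBC and land it as a CSV file.
# DSN, credentials and table name are placeholders for illustration only.
conn = pyodbc.connect("DSN=LegacyERP;UID=dm_reader;PWD=***")
cursor = conn.cursor()
cursor.execute("SELECT CustomerId, Name, Country FROM dbo.Customers")

with open("customers.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([column[0] for column in cursor.description])  # header row
    writer.writerows(cursor.fetchall())

conn.close()

The same pattern works for any table or view the read-only account can see, which is what makes the pull strategy attractive during the iterative phases of a DM.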

Direct access to the data allows in theory the most flexibility, as the solution can extract the data whenever needed. This is especially important during the initial phases of the project, when the data need to be pulled more frequently until the requirements and logic are stabilized. A push strategy tends to add overhead, as usually somebody else oversees the data exports and the data need to be prepared in the expected format. Therefore, it can make sense to make an exception for a DM and allow direct access to the data.

Hybrid strategies tend to be more complex and require additional resources or overhead, as the data are stored temporarily at a separate location. Unfortunately, in certain scenarios this is the only approach that can be used. Data files that preserve the integrity of the data and facilitate data consumption are preferred; therefore, tabular text files or JSON files are preferred over XML or Excel files. It's also preferable to export each data structure individually rather than storing parent-child structures, even if the latter can prove useful in certain scenarios. When there's no other solution, one can also use the standard reports available in the legacy systems.
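To illustrate the "one data structure per file" preference, here is a hypothetical sketch (again Python with pyodbc; all table, column and file names are invented) that exports invoice headers and lines into two separate tab-delimited files instead of one nested parent-child document:

import csv
import pyodbc

conn = pyodbc.connect("DSN=LegacyERP;UID=dm_reader;PWD=***")

def export_query(sql, target_file):
    # Run a query and dump the full result set as a tab-delimited text file.
    cursor = conn.cursor()
    cursor.execute(sql)
    with open(target_file, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow([column[0] for column in cursor.description])
        writer.writerows(cursor.fetchall())

# One flat file per data structure, headers and lines kept apart.
export_query("SELECT InvoiceId, CustomerId, InvoiceDate FROM dbo.InvoiceHeaders",
             "invoice_headers.txt")
export_query("SELECT InvoiceId, LineNo, ItemId, Quantity FROM dbo.InvoiceLines",
             "invoice_lines.txt")
conn.close()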

When storing data outside the legacy systems for further processing, it's recommended to follow the organization's best practices and to address the data security and privacy requirements. ETL tools allow accessing data from password-protected locations like FTP, OneDrive or SharePoint. The fewer security layers in between, the lower the overhead is in theory. Therefore, given its stability and simplicity, FTP might prove to be a better storage solution than OneDrive, SharePoint or other similar technologies.
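As an example of landing an export in a password-protected area, a sketch using Python's standard ftplib over TLS; the host, credentials and file name are placeholders, and the actual target depends on the organization's setup:

from ftplib import FTP_TLS

# Upload an export file to a password-protected FTP location over TLS,
# so neither credentials nor data travel in clear text.
with FTP_TLS("ftp.example.com") as ftp:        # placeholder host
    ftp.login("dm_user", "***")                # placeholder credentials
    ftp.prot_p()                               # encrypt the data channel as well
    with open("customers.csv", "rb") as f:
        ftp.storbinary("STOR customers.csv", f)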

Ideally, the extraction/export mechanisms should use the database objects that already encapsulate the logic in the legacy systems; otherwise the team will need to reengineer the logic. For master data this can prove easy, though the logic behind transactional data like on-hand inventory or open invoices can be relatively complex to reengineer. Alternatively, the logic can be implemented directly in the extraction/export mechanisms, or sometimes it's more advisable to create database objects (usually on a different schema) in the legacy systems and just call the respective objects.
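A hedged sketch of the latter approach, assuming a SQL Server-based legacy system (2016 or later, for CREATE OR ALTER) and invented schema, table and column names: the on-hand logic is wrapped in a view on a dedicated dm schema, so the extraction itself stays a trivial SELECT:

import pyodbc

conn = pyodbc.connect("DSN=LegacyERP;UID=dm_admin;PWD=***", autocommit=True)
cursor = conn.cursor()

# Create a dedicated schema for DM objects, if it doesn't exist yet.
cursor.execute("IF SCHEMA_ID('dm') IS NULL EXEC('CREATE SCHEMA dm');")

# Encapsulate the on-hand logic once, in a view on the legacy database.
cursor.execute("""
CREATE OR ALTER VIEW dm.OnHandInventory AS
SELECT it.ItemId, it.WarehouseId, SUM(it.Quantity) AS OnHandQty
FROM dbo.InventoryTransactions it
GROUP BY it.ItemId, it.WarehouseId;
""")

# The extraction just calls the object that carries the logic.
rows = cursor.execute("SELECT ItemId, WarehouseId, OnHandQty FROM dm.OnHandInventory").fetchall()
conn.close()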

When connecting directly to the data source, it's advisable to use the data provider that allows the best performance and flexibility; however, several tests might be needed to determine the best fit. It's also useful to check the limitations of each provider and find a stable driver version. OLEDB and ADO.Net data providers offer in general good performance, though the native drivers of the legacy systems can be a better option, depending on the case.
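Since the best provider is usually found empirically, a rough timing harness along these lines can help; the driver names, server and query below are placeholders:

import time
import pyodbc

# Compare candidate ODBC drivers on the same extraction query.
candidates = {
    "native":  "DRIVER={Legacy Native Driver};SERVER=legacy01;UID=dm_reader;PWD=***",
    "generic": "DRIVER={ODBC Driver 17 for SQL Server};SERVER=legacy01;UID=dm_reader;PWD=***",
}

for name, connection_string in candidates.items():
    start = time.perf_counter()
    conn = pyodbc.connect(connection_string)
    rows = conn.cursor().execute("SELECT * FROM dbo.Customers").fetchall()
    conn.close()
    print(f"{name}: {len(rows)} rows in {time.perf_counter() - start:.1f}s")

Running the same query through each candidate a few times, at different times of day, gives a more reliable picture than a single measurement.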

Some legacy systems allow access to their data only via service-based technologies like OData. OData tends to have poorer performance for large data exports than standard access methods and is therefore not indicated in such scenarios. In such cases it might be a good idea to export the data directly from the legacy system instead.
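When OData must be used anyway, the export typically has to page through the feed; here is a sketch with the requests package against a hypothetical endpoint, following the @odata.nextLink annotation:

import requests

# Page through an OData feed; endpoint and credentials are placeholders.
url = "https://legacy.example.com/odata/Customers?$select=CustomerId,Name,Country"
records = []

while url:
    response = requests.get(url, auth=("dm_reader", "***"), timeout=60)
    response.raise_for_status()
    payload = response.json()
    records.extend(payload.get("value", []))
    url = payload.get("@odata.nextLink")  # None once the last page is reached

print(f"Exported {len(records)} records via OData")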


12 August 2009

DBMS: Open Database Connectivity (Definitions)

"A set of drivers provided by different manufacturers that allows client applications to connect to database servers without regard to native database language. These drivers interpret data requests and translate those requests into a format that the database understands." (Patrick Dalton, "Microsoft SQL Server Black Book", 1997)

"A database-neutral API for Microsoft Windows applications. ODBC is designed to enable any database vendor to write a driver that supports the API." (Owen Williams, "MCSE TestPrep: SQL Server 6.5 Design and Implementation", 1998)

"A database-material application programming interface (API) aligned with the American National Standards Institute (ANSI) and International Organization for Standardization (ISO) standards for a database call-level interface (CLI). ODBC supports access to any database for which an ODBC driver is available." (Microsoft Corporation, "SQL Server 7.0 System Administration Training Kit", 1999)

"A data access application programming interface (API) that supports access to any data source for which an ODBC driver is available. ODBC is aligned with the American National Standards Institute (ANSI) and International Organization for Standardization (ISO) standards for a database Call Level Interface (CLI)." (Anthony Sequeira & Brian Alderman, "The SQL Server 2000 Book", 2003)

"A set of standards that allow applications that are not dependent on any one specific database to process SQL statements against any database that supports SQL." (Bob Bryla, "Oracle Database Foundations", 2004)

"A data access application programming interface (API) that supports access to any data source for which an ODBC driver is available. ODBC is aligned with the American National Standards Institute (ANSI) and International Standards Organization for (ISO) standards for a database Call Level Interface (CLI)." (Thomas Moore, "EXAM CRAM™ 2: Designing and Implementing Databases with SQL Server 2000 Enterprise Edition", 2005)

"Interface used to provide access to data sources that provide an ODBC driver. The driver is a software file that provides instructions for how to access the data source." (Sara Morganand & Tobias Thernstrom , "MCITP Self-Paced Training Kit : Designing and Optimizing Data Access by Using Microsoft SQL Server 2005 - Exam 70-442", 2007)

"A data access API that supports access to any data source for which an ODBC driver is available." (Jim Joseph, "Microsoft SQL Server 2008 Reporting Services Unleashed", 2009)

"Database middleware developed by Microsoft to provide a database access API to Windows applications." (Carlos Coronel et al, "Database Systems: Design, Implementation, and Management" 9th Ed., 2011)

"A call-level interface (CLI) for interacting with databases. ODBC provides routines to allocate and deallocate resources, control connections to the database, execute SQL statements, obtain diagnostic information, control transaction termination, and obtain information about the implementation." (Craig S Mullins, "Database Administration: The Complete Guide to DBA Practices and Procedures" 2nd Ed., 2012)

"A standard database access technology developed by Microsoft, the purpose of which is to allow accessing any DBMS from any application, regardless of which DBMS is managing the data." (Craig S Mullins, "Database Administration: The Complete Guide to DBA Practices and Procedures" 2nd Ed., 2012)

