Relieving the mainframe
Our customers are our best references.
tcVISION is used in daily production IT operations on a wide range of platforms, with a large number of input databases and files as well as output databases, cloud services, and big data systems.
The coexistence and interplay of these platforms and data storage systems is constantly changing. New technologies keep emerging and reshaping modern IT; a look back over the last 10 to 15 years makes this clear. The rise and dominance of cloud and big data deserve special mention here. These constantly changing landscapes and technologies are a challenge for BOS. We have taken on this challenge and will continue to do so in the future. Hardly any other solution on the market offers as many integration options, areas of application, and processing platforms as tcVISION.
In today's blog post we take a closer look at selected customer applications. Please get in touch with our sales department if you require more detailed information.
This post is about one of our European customers, a service company in the financial sector. The services it provides are used by well-known European companies and banks.
tcVISION has been in use at this customer since 2018. Before the contract was signed, tcVISION was evaluated in a proof of concept (PoC) and compared with solutions from IBM and Oracle. The decision was made in favor of tcVISION.
The customer runs an IBM mainframe under the z/OS operating system. The data is held in VSAM files and Db2 databases. Oracle, which was also used in the PoC, serves as the output destination.
Change data for Db2 and VSAM is captured on the mainframe in real time (change data capture, CDC) and applied to Oracle.
After the first project went into production, the output scenario was extended to Apache Kafka. tcVISION sends the data to be processed from the mainframe via Kafka to a series of microservices, which store it in Oracle.
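To illustrate the pattern, here is a minimal sketch of how such a microservice might translate a change-data message received from Kafka into an Oracle-style upsert or delete statement. This is not tcVISION's actual implementation or message format; the JSON layout (`op`, `table`, `columns`) and the `ACCOUNTS` table are assumptions made for the example.

```python
import json

def cdc_to_sql(message: str) -> str:
    """Translate a JSON change-data message into an Oracle-style
    MERGE (upsert) or DELETE statement with bind placeholders.
    The message layout is a hypothetical example, not tcVISION's
    actual wire format."""
    event = json.loads(message)
    table = event["table"]
    if event["op"] == "D":  # delete event
        return f"DELETE FROM {table} WHERE ID = :id"
    # insert or update event -> Oracle MERGE (upsert) keyed on ID
    cols = list(event["columns"])
    src = ", ".join(f":{c} AS {c}" for c in cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in cols if c != "ID")
    return (
        f"MERGE INTO {table} t USING (SELECT {src} FROM dual) s "
        f"ON (t.ID = s.ID) "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({', '.join(cols)}) "
        f"VALUES ({', '.join(':' + c for c in cols)})"
    )

# Example: an update event captured from Db2 on the mainframe
msg = json.dumps({"op": "U", "table": "ACCOUNTS",
                  "columns": {"ID": 1, "BALANCE": 250}})
print(cdc_to_sql(msg))
```

In a real deployment the statement would be executed against Oracle with the event's column values bound to the placeholders.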
The customer's existing infrastructure comprises several environments (e.g. test, migration, pre-production, and production). An important customer requirement was that tcVISION be able to move or copy repository entries, i.e. metadata, processing rules, and any existing processing definitions, between these environments. tcVISION offers the tcVMIGRATE batch tool for this purpose. The tool can perform complete or selective transfers between repositories in different environments, operating directly on the actual repository data.
In addition to the Kafka and Oracle output platforms, certain mainframe data is also written to an HBase database. Another project streams mainframe data with tcVISION to the Amazon Web Services (AWS) cloud, where the data streams are analyzed with Amazon Kinesis.
tcVISION captures in real time the data created, for example, during online processing on a mainframe system (CICS, IMS/DB, Software AG Adabas/Natural, CA IDMS). According to the definitions and rules stored in the repository, the data is then sent as a stream to a big data environment. tcVISION offers the same options for batch processing: several methods are available for determining batch change data (log file processing, real-time capture, batch compare) and sending it as a data stream to a big data environment for real-time analysis.
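Of the batch methods mentioned, batch compare is conceptually the simplest: two snapshots of a data set are compared and the differences are emitted as change records. The following is an illustrative sketch of that idea, not tcVISION's implementation; the keyed-record layout is an assumption made for the example.

```python
def batch_compare(before: dict, after: dict) -> list:
    """Compare two keyed snapshots of a data set and emit change
    records: ("I", key, row) for inserts, ("U", key, row) for
    updates, ("D", key, None) for deletes. Illustrative only,
    similar in spirit to a batch-compare CDC method."""
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("I", key, row))   # record appeared
        elif before[key] != row:
            changes.append(("U", key, row))   # record changed
    for key in before:
        if key not in after:
            changes.append(("D", key, None))  # record disappeared
    return changes

# Snapshots of a VSAM-like keyed file before and after a batch run
old = {"0001": "ALICE  100", "0002": "BOB    200"}
new = {"0001": "ALICE  150", "0003": "CAROL  300"}
print(batch_compare(old, new))
# → [('U', '0001', 'ALICE  150'), ('I', '0003', 'CAROL  300'), ('D', '0002', None)]
```

The resulting change records can then be streamed onward exactly like change data captured in real time.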
tcVISION is ideally suited to connecting a traditional mainframe (whether running z/OS or z/VSE) to a big data environment or the cloud via data streaming.
An overview of all supported input and output targets can be found here.