APC Tech Blog

This is the technical blog of AP Communications Co., Ltd.

The Future is Open: Data Streaming in an Omni-Cloud Reality

Introduction

I'm Chen from the Lake House Department of the GLB Division. This article summarizes a session based on an on-site report by Mr. Nagae, who is attending Data + AI Summit 2023 (DAIS).

This article introduces the session "The Future is Open: Data Streaming in an Omni-Cloud Reality", delivered by Christina Taylor, Senior Staff Engineer at Catalyst Software. The talk covers data streaming techniques and best practices, as well as BID and infrastructure code best practices, with a focus on the benefits of openness and interoperability for creating future-ready, low-latency data systems.

Data streaming techniques and best practices

Data streaming technology is expected to become increasingly important. By understanding and implementing best practices for incremental technology and for combining open-source and governed technologies, you will be able to build future-ready, low-latency data systems. At the same time, Taylor points out the importance of openness and interoperability.

How to use incremental technology

In data streaming, the use of incremental technology is considered important. Incremental technology captures data changes and updates in real time and reflects them in the data stream, which speeds up data processing and enables real-time analysis and decision-making.
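
To make this concrete, here is a minimal sketch of incremental change capture using Delta Lake's Change Data Feed. The table path and the assumption that the table was created with delta.enableChangeDataFeed set to true are hypothetical details for illustration:

```python
# A minimal sketch: stream only row-level changes (inserts, updates, deletes)
# of a Delta table instead of re-reading the whole dataset each time.
# Assumes PySpark with Delta Lake installed; the table path is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdf-reader").getOrCreate()

changes = (spark.readStream
           .format("delta")
           .option("readChangeFeed", "true")  # emit change rows, not snapshots
           .load("/delta/customers"))         # hypothetical table path

# Each change row carries a _change_type column:
# insert, update_preimage, update_postimage, or delete.
(changes.writeStream
 .format("console")
 .option("checkpointLocation", "/tmp/_checkpoints/customers_cdf")
 .start()
 .awaitTermination())
```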

Specific uses of incremental technology include:

  1. Introducing an event-driven architecture to detect data changes in real time
  2. Using stream processing engines such as Apache Kafka and Apache Flink for real-time data processing (see the sketch after this list)
  3. Leveraging Delta streams to efficiently handle data updates and deletes
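
Combining items 1 and 2, the sketch below subscribes to a Kafka topic and incrementally appends the parsed events to a Delta table. The topic name, event schema, server address, and paths are all assumptions for illustration:

```python
# A minimal sketch of an event-driven, incremental pipeline:
# Kafka topic -> parsed events -> Delta table, with checkpointed progress.
# Topic, schema, and paths are hypothetical; requires PySpark with the
# Kafka and Delta Lake packages.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("incremental-ingest").getOrCreate()

# Assumed schema of the incoming event payload.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("event_time", TimestampType()),
])

# Event-driven source: subscribe to new records as they arrive.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical
       .option("subscribe", "user-events")                   # hypothetical topic
       .load())

# Parse the Kafka message value into structured columns.
events = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Incremental sink: append each micro-batch to a Delta table; the checkpoint
# lets processing resume exactly where it left off after a restart.
(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/delta/_checkpoints/user_events")
 .outputMode("append")
 .start("/delta/user_events")
 .awaitTermination())
```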

Best practices for open source and governed technology

Leveraging both open-source and governed technologies in data streaming is a best practice. Open-source technology has the advantage of community-driven development and support, along with high flexibility and extensibility, while governed technology has the advantage of easier operation and maintenance, plus greater reliability and security.

BID and infrastructure code best practices

Christina Taylor also discussed BID (Big Data, Internet, Database) and infrastructure code best practices, arguing that they are the only way for teams to work faster and scale across multiple environments and platforms.

Key BID points

  1. Modularization: Dividing your code into reusable modules speeds up development (see the sketch after this list).
  2. Automation: Automating infrastructure build and deployment reduces manual errors and improves efficiency.
  3. Version control: Tracking the history of code changes, with the ability to revert to previous versions when necessary, ensures safety and reliability.
  4. Testing: Automated tests ensure code quality so that issues can be addressed quickly.
  5. Documentation: Creating and updating appropriate documentation makes code easier to understand and maintain.
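
As a small illustration of the modularization and testing points, the sketch below isolates a transformation in a reusable, infrastructure-free function and guards it with an automated test runnable under pytest. The function and its contract are hypothetical:

```python
# A minimal sketch of modular, testable pipeline code. The transformation is
# a pure function, so it can be reused across jobs and unit-tested without
# touching any infrastructure. Function name and contract are hypothetical.

def normalize_event(event: dict) -> dict:
    """Reusable module: keep known fields and lower-case the action."""
    allowed = {"user_id", "action", "event_time"}
    cleaned = {k: v for k, v in event.items() if k in allowed}
    cleaned["action"] = cleaned.get("action", "").lower()
    return cleaned


def test_normalize_event():
    """Automated test (run with `pytest`) guarding the module's contract."""
    raw = {"user_id": "42", "action": "CLICK", "debug": True}
    assert normalize_event(raw) == {"user_id": "42", "action": "click"}
```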

Summary

By understanding and implementing best practices for data streaming technology, BID, and infrastructure code, you can build future-ready, low-latency data systems and use this knowledge to strengthen the competitiveness of your business.

Conclusion

This content is based on reports from members participating on site in the DAIS sessions. During the DAIS period, articles related to the sessions will be posted on the special site below, so please take a look.

Translated by Johann

https://www.ap-com.co.jp/data_ai_summit-2023/