The Attunity Blog
As we headed into 2016, Information Week had ten predictions for Big Data. Their list included the rise of the Chief Data Officer, the coming of the data-as-a-service business model, and the ability to get real-time insights from data. Looking back at 2016, we saw many of these predictions come true with what some of Attunity’s customers did with their Big Data.
The story of how data became big starts many years before the current buzz around big data. The first attempts to quantify the growth rate in the volume of data, popularly known as the "information explosion" (a term first used in 1941, according to the Oxford English Dictionary), date back some seventy years. The following are the major milestones in the history of sizing data volumes, plus other "firsts" in the evolution of the idea of "big data" and observations pertaining to the data or information explosion.
North Bridge, a growth equity and venture capital firm, in partnership with research analyst firm Wikibon, announced the results of its sixth annual Future of Cloud Computing Survey, which analyzes trends in cloud computing adoption, use and challenges on a yearly basis.
Itamar Ankorion, CMO of Attunity Inc., spoke to Jeff Frick (@JeffFrick), host of theCUBE*, from the SiliconANGLE Media team, during AWS re:Invent about customers' struggles to manage their sprawling data. (*Disclosure below) He said that while it was easy enough for Attunity to help them migrate some data to Amazon Redshift, customers still needed to set up actual data centers.
Today, Attunity is thrilled to announce the availability of Attunity Compose for Amazon Redshift, a new and innovative solution for automating and accelerating data warehousing and ETL in the AWS cloud!
"The mainframe is going away" is as true now as it was 10, 20 and 30 years ago. Mainframes remain crucial for handling critical business transactions. However, they were built for an era when batch data movement was the norm, and they can be difficult to integrate into today's data-driven, real-time, analytics-focused business processes and the environments that support them.
When a giant online sports apparel and memorabilia retailer wanted to move their data into a data lake in the cloud, they needed a seamless method to migrate their data. Read on to learn how Fanatics, the world's most-licensed sports retailer, used Amazon Web Services and Attunity CloudBeam to move data into a cloud-based data lake to perform data analytics.
Mercedes-Benz is renowned the world over for well-engineered, safe, reliable cars designed to maximise the driver experience with style, comfort and easy-to-use controls.
Their business is underpinned and run by SAP. Whilst more than adequate at helping the business "run better", the SAP landscape brings its own nuances and complexities, especially as the database grows. With as many as twenty independent Dev systems supporting a continuous stream of new development projects, one can appreciate Mercedes-Benz's need for a slick Test Data Management solution.
Data warehouses form the foundation of many Big Data initiatives, helping guide management decisions to operate effectively, serve customers better and realize a competitive advantage. It's therefore important to manage data efficiently, so that you've got the right data in the right place at the right time. HPE, Hortonworks and Attunity have teamed up to do just that, with a solution that helps you offload your data warehouse and optimize your data architecture with Hadoop®.
This year, North Bridge, an inception-to-growth stage venture capital firm, and Wikibon, a research analyst firm, along with Attunity and more than 40 cloud technology firms, are collaborating on the 6th annual Future of Cloud Computing Survey.