We performed a comparison between Azure Data Factory and Denodo based on real PeerSpot user reviews.
Find out in this report how the two Data Integration solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
"We have found the bulk load feature very valuable."
"I enjoy the ease of use for the backend JSON generator, the deployment solution, and the template management."
"The most valuable features of Azure Data Factory are the flexibility, ability to move data at scale, and the integrations with different Azure components."
"I like the basic features like the data-based pipelines."
"The data mapping and the ability to systematically derive data are nice features. It worked really well for the solution we had. It is visual, and it did the transformation as we wanted."
"It makes it easy to collect data from different sources."
"From what we have seen so far, the solution seems very stable."
"An excellent tool for pipeline orchestration."
"It is a go-to tool for data virtualization. The virtualization and data catalog are the features of why we chose Denodo."
"The most valuable features are data lineage and the concept of a semantic layer."
"The most valuable aspects of this solution are the short time frame in which you can deliver and connect."
"The data abstraction is the most valuable feature."
"It can support a number of data sources, and it can pull flat files, from cloud-based databases or from those on-premises. Denodo can pull from any data source and interface with the view. Then, we can publish the view."
"The logical data warehouse functionality is fantastic. It truly stands out. The ClearOptimizer and Virtual Cache are great features. They work together seamlessly to optimize performance."
"The best thing about Denodo is that creating and deploying a web service can be done in about 10 minutes, compared to a whole day when it comes to other solutions (such as when deploying with Java and AWS)."
"Denodo's best features are its performance, easy data transformation, and the job scheduler."
"Lacks a decent UI that would give us a view of the kinds of requests that come in."
"The solution can be improved by decreasing the warmup time which currently can take up to five minutes."
"I would like to be informed about the changes ahead of time, so we are aware of what's coming."
"An area for improvement in Azure Data Factory is its speed. Parallelization also needs improvement."
"It does not appear to be as rich as other ETL tools. It has very limited capabilities."
"There is room for improvement primarily in its streaming capabilities. For structured streaming and machine learning model implementation within an ETL process, it lags behind tools like Informatica."
"We have experienced some issues with the integration. This is an area that needs improvement."
"It needs more work on developing out-of-the-box connectors for other products like Oracle, AWS, and others."
"We can't scale it to meet digital requirements."
"I would like to see a proper way to avoid killing the sourcing systems."
"Denodo's training documentation could be improved by providing more material. From an administrative standpoint, I've found that only Denodo websites provide the usual tutorials. It may be because it's a bit of a restricted tool, but it results in trouble with learning. Normally, I can find help and solutions from other sources, but I haven't been able to find any for Denodo. Other than that, it's fine and it performs well. I only have six months of experience, so I can't accurately suggest improvements."
"The integration could use improvement; there are a lot of slow processes that we have discovered in our country. The configurations could also use a lot more improvement."
"Denodo has some difficulty supporting large numbers of records."
"It would be beneficial to make sure that the team that will be using Denodo has some kind of training on how to use the product at least a month beforehand, and there could even be some kind of feedback or Q&A sessions to go along with the training. If Denodo were able to provide this kind of training, it would be very helpful to users in insurance and banking companies because the staff are typically older and not always technically-minded."
"The support is not the best and should be improved."
"Performance management could be improved."
Azure Data Factory is ranked 1st in Data Integration with 81 reviews while Denodo is ranked 12th in Data Integration with 29 reviews. Azure Data Factory is rated 8.0, while Denodo is rated 7.8. The top reviewer of Azure Data Factory writes "The data factory agent is quite good but pricing needs to be more transparent". On the other hand, the top reviewer of Denodo writes "Saves our underwriters' time with data virtualization, but could provide more learning resources". Azure Data Factory is most compared with Informatica PowerCenter, Informatica Cloud Data Integration, Alteryx Designer, Snowflake and Oracle GoldenGate, whereas Denodo is most compared with AWS Glue, Mule Anypoint Platform, Delphix, Informatica PowerCenter and Palantir Foundry. See our Azure Data Factory vs. Denodo report.
See our list of best Data Integration vendors.
We monitor all Data Integration reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.
Greetings, Stefan.
Alteryx is basically an ETL tool that evolved to deliver some Data Viz and ML features too. This means that its main purpose is to extract data from different sources, combine and transform them and finally load them in a different database.
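That extract-transform-load cycle can be sketched in a few lines. The example below is illustrative only and is not tied to Alteryx or any specific tool; the table names and source systems are hypothetical, and SQLite stands in for the real databases.

```python
import sqlite3

def extract(conn, query):
    # Extract: pull rows out of a source system
    return conn.execute(query).fetchall()

# Two hypothetical source systems (in-memory stand-ins)
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "acme"), (2, "globex")])

billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.executemany("INSERT INTO invoices VALUES (?, ?)",
                    [(1, 120.0), (1, 80.0), (2, 50.0)])

# Transform: combine and aggregate the two sources in application code
totals = {}
for cid, amount in extract(billing, "SELECT customer_id, amount FROM invoices"):
    totals[cid] = totals.get(cid, 0.0) + amount
names = dict(extract(crm, "SELECT id, name FROM customers"))
rows = [(names[cid].upper(), total) for cid, total in sorted(totals.items())]

# Load: physically copy the result into a separate target database
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE revenue (customer TEXT, total REAL)")
warehouse.executemany("INSERT INTO revenue VALUES (?, ?)", rows)

print(warehouse.execute("SELECT * FROM revenue").fetchall())
# [('ACME', 200.0), ('GLOBEX', 50.0)]
```

The key point for the comparison below is the load step: the data ends up duplicated in a second store, which is exactly what virtualization avoids.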
Denodo is a data virtualization tool, which means it does all the transformations without extracting data from one place and loading it into another. It's a cloud-based solution and it charges by traffic. If your company is subject to data-protection rules such as the EU's General Data Protection Regulation that prohibit, for instance, extracting data located in a data center in Europe and loading it into a cluster located in the USA, you will probably need a virtualization tool like Denodo instead of an ETL tool like Alteryx. Virtualization tools are usually more expensive in the long run.
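The virtualization idea can be sketched the same way. This is a conceptual illustration, not Denodo's actual API: a "virtual view" is just a function that federates queries across the source systems at request time, so no data is ever copied into a second store.

```python
class CrmSource:
    """Stand-in for a source system that stays where it is and is queried in place."""
    def __init__(self):
        self.names = {1: "acme", 2: "globex"}

    def get_name(self, cid):
        return self.names[cid]

class BillingSource:
    """Second stand-in source; its invoices are never extracted anywhere."""
    def __init__(self):
        self.invoices = [(1, 120.0), (1, 80.0), (2, 50.0)]

    def get_total(self, cid):
        return sum(amount for c, amount in self.invoices if c == cid)

def make_virtual_view(crm, billing):
    # The published "view": each call hits the original systems directly,
    # combining their answers at query time instead of at load time.
    def view(cid):
        return {"customer": crm.get_name(cid), "total": billing.get_total(cid)}
    return view

view = make_virtual_view(CrmSource(), BillingSource())
print(view(1))  # {'customer': 'acme', 'total': 200.0}
```

Compared with the ETL sketch above, the result is computed fresh on every request and the data never leaves its source, which is why this model fits residency constraints like the GDPR scenario described.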
Azure Data Factory is a platform meant to leverage the use of Azure. Microsoft's objective is to sell its cloud solution as a whole. It works alongside a Data Studio (to manage and control your data), Spark (an in-memory processing engine comparable to Hadoop), and data lake storage.
As you can see, these are three different products, and it does not make much sense to use them together.
I'd say that there is a misconception in some of the answers (but don't worry, it's a common one).
Alteryx is not an ETL tool; it's an analytics platform with very powerful ETL capabilities (accessing nearly all available data sources and processing them at high speed, among others).
But additionally, Alteryx gives you the ability to carry out the complete analytics cycle: processing, cleaning, and blending those diverse data sources; building descriptive, predictive, and prescriptive models (plus some ML & AI); and outputting to another huge variety of data sources, reporting tools, or visualization tools.
All of the above can be achieved with no coding at all, but in case you want to code, Alteryx also offers native Python, R & Scala integration. In other words, it can solve business users' use cases and advanced/technical use cases at the same time.
Finally, it's a fixed license, with no additional costs per usage (at least so far, until they release the Cloud Version).
I hope I was able to clarify the role of Alteryx in the analytics landscape.