AWS Batch vs Apache Spark comparison

Apache Spark: 2,893 views | 2,256 comparisons | 89% willing to recommend
AWS Batch: 6,770 views | 6,477 comparisons | 100% willing to recommend
Comparison Buyer's Guide
Executive Summary

We performed a comparison between Apache Spark and AWS Batch based on real PeerSpot user reviews.

Find out in this report how the two Compute Service solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
To learn more, read our detailed AWS Batch vs. Apache Spark Report (Updated: May 2024).
772,649 professionals have used our research since 2012.
Quotes From Members
We asked business professionals to review the solutions they use.
Here are some excerpts of what they said:
Pros
"Features include machine learning, real time streaming, and data processing.""Its scalability and speed are very valuable. You can scale it a lot. It is a great technology for big data. It is definitely better than a lot of earlier warehouse or pipeline solutions, such as Informatica. Spark SQL is very compliant with normal SQL that we have been using over the years. This makes it easy to code in Spark. It is just like using normal SQL. You can use the APIs of Spark or you can directly write SQL code and run it. This is something that I feel is useful in Spark.""Apache Spark provides a very high-quality implementation of distributed data processing.""I appreciate everything about the solution, not just one or two specific features. The solution is highly stable. I rate it a perfect ten. The solution is highly scalable. I rate it a perfect ten. The initial setup was straightforward. I recommend using the solution. Overall, I rate the solution a perfect ten.""The fault tolerant feature is provided.""The product is useful for analytics.""The solution is scalable.""The good performance. The nice graphical management console. The long list of ML algorithms."

More Apache Spark Pros →
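One reviewer above notes that Spark SQL accepts standard SQL alongside Spark's programmatic APIs. A minimal PySpark sketch of both paths (the sample sales data and column names are illustrative, not drawn from any review):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("spark-sql-sketch").getOrCreate()

# Hypothetical sales data; in practice this would come from Parquet, JDBC, Hive, etc.
df = spark.createDataFrame(
    [("2024-01-01", "widget", 10.0), ("2024-01-02", "gadget", 25.5)],
    ["order_date", "product", "amount"],
)
df.createOrReplaceTempView("sales")

# Path 1: plain SQL, just like a traditional warehouse query
by_product_sql = spark.sql(
    "SELECT product, SUM(amount) AS total FROM sales GROUP BY product"
)

# Path 2: the equivalent DataFrame API call
by_product_api = df.groupBy("product").agg(F.sum("amount").alias("total"))

by_product_sql.show()
by_product_api.show()
spark.stop()
```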

"We can easily integrate AWS container images into the product.""AWS Batch's deployment was easy.""AWS Batch manages the execution of computing workload, including job scheduling, provisioning, and scaling.""There is one other feature in confirmation or call confirmation where you can have templates of what you want to do and just modify those to customize it to your needs. And these templates basically make it a lot easier for you to get started."

More AWS Batch Pros →
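The template behavior the last reviewer describes most likely refers to reusable job definitions (or CloudFormation templates). A hedged boto3 sketch of registering a job definition that later jobs inherit and override; the names, image, and resource values are placeholders, not real resources:

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")  # region is an assumption

# Register a job definition that acts as a reusable template:
# submitted jobs inherit these settings and can override command/environment.
response = batch.register_job_definition(
    jobDefinitionName="nightly-etl",  # placeholder name
    type="container",
    containerProperties={
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/etl:latest",  # placeholder image
        "command": ["python", "etl.py"],
        "resourceRequirements": [
            {"type": "VCPU", "value": "2"},
            {"type": "MEMORY", "value": "4096"},  # MiB
        ],
    },
    retryStrategy={"attempts": 2},
)
print(response["jobDefinitionArn"])
```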

Cons
"They could improve the issues related to programming language for the platform.""Technical expertise from an engineer is required to deploy and run high-tech tools, like Informatica, on Apache Spark, making it an area where improvements are required to make the process easier for users.""Dynamic DataFrame options are not yet available.""The solution’s integration with other platforms should be improved.""Apache Spark provides very good performance The tuning phase is still tricky.""It would be beneficial to enhance Spark's capabilities by incorporating models that utilize features not traditionally present in its framework.""It requires overcoming a significant learning curve due to its robust and feature-rich nature.""The management tools could use improvement. Some of the debugging tools need some work as well. They need to be more descriptive."

More Apache Spark Cons →

"AWS Batch needs to improve its documentation.""The main drawback to using AWS Batch would be the cost. It will be more expensive in some cases than using an HPC. It's more amenable to cases where you have spot requirements.""The solution should include better and seamless integration with other AWS services, like Amazon S3 data storage and EC2 compute resources.""When we run a lot of batch jobs, the UI must show the history."

More AWS Batch Cons →

Pricing and Cost Advice
  • "Since we are using the Apache Spark version, not the data bricks version, it is an Apache license version, the support and resolution of the bug are actually late or delayed. The Apache license is free."
  • "Apache Spark is open-source. You have to pay only when you use any bundled product, such as Cloudera."
  • "We are using the free version of the solution."
  • "Apache Spark is not too cheap. You have to pay for hardware and Cloudera licenses. Of course, there is a solution with open source without Cloudera."
  • "Apache Spark is an expensive solution."
  • "Spark is an open-source solution, so there are no licensing costs."
  • "On the cloud model can be expensive as it requires substantial resources for implementation, covering on-premises hardware, memory, and licensing."
  • "It is an open-source solution, it is free of charge."
  • More Apache Spark Pricing and Cost Advice →

  • "AWS Batch's pricing is good."
  • "The pricing is very fair."
  • "AWS Batch is a cheap solution."
  • More AWS Batch Pricing and Cost Advice →

    Use our free recommendation engine to learn which Compute Service solutions are best for your needs.
    Questions from the Community
    Top Answer: We use Spark to process data from different data sources.
    Top Answer: In data analysis, you need to take real-time data from different data sources, process it in sub-second time, and do the transformation in sub-second time.
    Top Answer: AWS Lambda is a serverless solution. It doesn’t require any infrastructure, which allows for cost savings. There is no setup process to deal with, as the entire solution is in the cloud. If you use… more »
    Top Answer: AWS Batch manages the execution of computing workload, including job scheduling, provisioning, and scaling.
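The sub-second, multi-source processing described in the answers above usually maps to Spark Structured Streaming's micro-batch model. A minimal sketch; the socket source, port, and one-second trigger are illustrative assumptions rather than anything stated in the reviews:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode, split

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Read a text stream from a socket (Kafka or Kinesis would be typical in production)
lines = (
    spark.readStream.format("socket")
    .option("host", "localhost")
    .option("port", 9999)
    .load()
)

# Transform: split each line into words and keep a running count
words = lines.select(explode(split(lines.value, " ")).alias("word"))
counts = words.groupBy("word").count()

# Emit results to the console roughly every second
query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .trigger(processingTime="1 second")
    .start()
)
query.awaitTermination()
```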
    Ranking
    Apache Spark: 5th out of 16 in Compute Service
    Views: 2,893 | Comparisons: 2,256 | Reviews: 26 | Average Words per Review: 444 | Rating: 8.7
    AWS Batch: 4th out of 16 in Compute Service
    Views: 6,770 | Comparisons: 6,477 | Reviews: 4 | Average Words per Review: 973 | Rating: 9.0
    Also Known As
    AWS Batch: Amazon Batch
    Overview

    Spark provides programmers with an application programming interface centered on a data structure called the resilient distributed dataset (RDD), a read-only multiset of data items distributed over a cluster of machines that is maintained in a fault-tolerant way. It was developed in response to limitations in the MapReduce cluster computing paradigm, which forces a particular linear dataflow structure on distributed programs: MapReduce programs read input data from disk, map a function across the data, reduce the results of the map, and store reduction results on disk. Spark's RDDs function as a working set for distributed programs that offers a (deliberately) restricted form of distributed shared memory.
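As a rough illustration of the RDD model described above, here is a minimal PySpark sketch; the in-memory range is a stand-in for data that would normally be read from HDFS, S3, or another distributed store:

```python
from pyspark import SparkContext

sc = SparkContext(appName="rdd-sketch")

# Distribute an in-memory collection across 8 partitions to form an RDD
numbers = sc.parallelize(range(1, 1001), numSlices=8)

# Transformations are lazy and results stay in memory across stages,
# unlike MapReduce, which writes intermediate output to disk
squares = numbers.map(lambda x: x * x)

# An action triggers execution across the cluster
total = squares.reduce(lambda a, b: a + b)

print(total)  # 333833500
sc.stop()
```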

    AWS Batch enables developers, scientists, and engineers to easily and efficiently run hundreds of thousands of batch computing jobs on AWS. AWS Batch dynamically provisions the optimal quantity and type of compute resources (e.g., CPU or memory optimized instances) based on the volume and specific resource requirements of the batch jobs submitted. With AWS Batch, there is no need to install and manage batch computing software or server clusters that you use to run your jobs, allowing you to focus on analyzing results and solving problems. AWS Batch plans, schedules, and executes your batch computing workloads across the full range of AWS compute services and features, such as Amazon EC2 and Spot Instances.
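To make the job-submission flow above concrete, here is a hedged boto3 sketch; the queue, job definition, and script names are placeholders for resources that would already exist in your account:

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")  # region is an assumption

# Submit a job against an existing queue and job definition (placeholder names);
# AWS Batch provisions and scales the underlying EC2/Fargate capacity itself.
submitted = batch.submit_job(
    jobName="nightly-report",
    jobQueue="my-queue",
    jobDefinition="my-job-def",
    containerOverrides={
        "command": ["python", "run_report.py"],
        "environment": [{"name": "RUN_DATE", "value": "2024-05-01"}],
    },
)
job_id = submitted["jobId"]

# Check the scheduling/execution status that AWS Batch manages for us
status = batch.describe_jobs(jobs=[job_id])["jobs"][0]["status"]
print(job_id, status)
```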

    Sample Customers
    NASA JPL, UC Berkeley AMPLab, Amazon, eBay, Yahoo!, UC Santa Cruz, TripAdvisor, Taboola, Agile Lab, Art.com, Baidu, Alibaba Taobao, EURECOM, Hitachi Solutions
    Hess, Expedia, Kelloggs, Philips, HyperTrack
    Top Industries
    Apache Spark reviewers: Computer Software Company 33%, Financial Services Firm 12%, University 9%, Marketing Services Firm 6%
    Apache Spark visitors reading reviews: Financial Services Firm 25%, Computer Software Company 13%, Manufacturing Company 7%, Comms Service Provider 5%
    AWS Batch visitors reading reviews: Financial Services Firm 26%, Computer Software Company 13%, Manufacturing Company 6%, Educational Organization 5%
    Company Size
    Apache Spark reviewers: Small Business 42%, Midsize Enterprise 16%, Large Enterprise 42%
    Apache Spark visitors reading reviews: Small Business 17%, Midsize Enterprise 12%, Large Enterprise 71%
    AWS Batch visitors reading reviews: Small Business 16%, Midsize Enterprise 12%, Large Enterprise 72%
    Buyer's Guide
    AWS Batch vs. Apache Spark
    May 2024
    Find out what your peers are saying about AWS Batch vs. Apache Spark and other solutions. Updated: May 2024.

    Apache Spark is ranked 5th in Compute Service with 60 reviews while AWS Batch is ranked 4th in Compute Service with 4 reviews. Apache Spark is rated 8.4, while AWS Batch is rated 9.0. The top reviewer of Apache Spark writes "Reliable, able to expand, and handle large amounts of data well". On the other hand, the top reviewer of AWS Batch writes "User-friendly, good customization and offers exceptional scalability, allowing users to run jobs ranging from 32 cores to over 2,000 cores". Apache Spark is most compared with Spring Boot, Spark SQL, SAP HANA, Cloudera Distribution for Hadoop and Azure Stream Analytics, whereas AWS Batch is most compared with AWS Lambda, AWS Fargate, Oracle Compute Cloud Service, Amazon EC2 Auto Scaling and Amazon EC2. See our AWS Batch vs. Apache Spark report.

    See our list of best Compute Service vendors.

    We monitor all Compute Service reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.