How to calculate max value by group in Pyspark

Aggregation of fields is one of the basic necessities of data analysis and data science. PySpark provides easy ways to perform aggregations and calculate metrics. Finding the maximum value for each group is one such task, and it can be done as part of a group by operation. The function that helps find the maximum value is max(). The article below explains, with the help of an example, how to calculate the max value by group in PySpark.

John has store sales data available for analysis. There are five columns in the data: Geography (country of the store), Department (industry category of the store), StoreID (unique ID of each store), Time Period (month of sales), and Revenue (total sales for the month). John wants to calculate the maximum revenue for each store. As there are 4 months of data available for each store, there will be one maximum value out of the four.


Example 1: Find the maximum sales for each store in PySpark

  • Step 1: First, import all the necessary modules and create the Spark context.
import findspark
findspark.init()
import pyspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext("local", "App Name")
sql = SQLContext(sc)
  • Step 2: Then, use the max() function along with a groupBy operation. As we want one maximum per store, “StoreID” is the groupBy column. The Revenue field contains the sales of each store, so “Revenue” is the column we take the maximum of. For the current example, the syntax is:
df1.groupBy("StoreID").agg({'Revenue':'max'}).show()

Example 2: Calculate Maximum value for each Department

  • Here we want to calculate the maximum value across each department, so the column in the groupBy operation is “Department”:
df1.groupBy("Department").agg({'Revenue':'max'}).show()

Thus, John is able to calculate the maximum revenue per store, as his requirement demanded, in PySpark. This kind of extraction is needed in many scenarios and use cases; this example covers one of them.

To get top certifications in PySpark and build your resume, visit here. Additionally, you can read the books listed here to build strong knowledge of PySpark.

Visit us below for the video tutorial:

📬 Stay Ahead in Data Science & AI – Subscribe to Newsletter!

  • 🎯 Interview Series: Curated questions and answers for freshers and experienced candidates.
  • 📊 Data Science for All: Simplified articles on key concepts, accessible to all levels.
  • 🤖 Generative AI for All: Easy explanations on Generative AI trends transforming industries.

💡 Why Subscribe? Gain expert insights, stay ahead of trends, and prepare with confidence for your next interview.

👉 Subscribe here:
