[Feature image: an orange coffee mug reading 'Monday' beside white Google Cloud icons on a blue background (GCP cloud symbol, analytics chart, upward savings arrow, trash bin), representing automated cloud optimization from Active Assist]

GCP Active Assist: Automated Cloud Optimization Recommendations

The Problem

Your GCP environment is bleeding money on idle VMs, oversized instances, and committed use discounts you’re not taking advantage of. Your IAM is a mess, with over-permissioned service accounts that violate least privilege. You have abandoned projects consuming resources that nobody remembers creating. Without manually analyzing metrics, reviewing IAM, and hunting through billing data, you have no visibility into optimization opportunities – and you definitely don’t have time to do that across hundreds of projects.

The Solution

GCP Active Assist uses machine learning to continuously analyze your environment and surface actionable recommendations for cost, security, performance, reliability, manageability, and sustainability. It’s already running – recommendations appear in the Recommendation Hub and via API. You can query recommendations programmatically, export them to BigQuery for trending, and automate remediation for trusted recommendation types. Unlike AWS Trusted Advisor’s full check set (which requires a Business or Enterprise support plan), Active Assist is free, and its recommendations are granular down to the individual resource.

Essential Active Assist Implementations

1. Recommendation Export and Analysis Script

#!/usr/bin/env python3
"""
Export all Active Assist recommendations across projects
Prioritize by cost impact and generate remediation scripts
"""

from google.cloud import recommender_v1
from google.cloud import resourcemanager_v3
import csv
from datetime import datetime

def get_all_projects():
    """Get all active projects in organization"""
    client = resourcemanager_v3.ProjectsClient()
    request = resourcemanager_v3.SearchProjectsRequest(
        query="state:ACTIVE"
    )
    
    projects = []
    for project in client.search_projects(request=request):
        projects.append({
            'project_id': project.project_id,
            'name': project.display_name
        })
    
    return projects

def get_recommendations(project_id, location='global'):
    """Get all recommendations for a project"""
    client = recommender_v1.RecommenderClient()
    
    # List of recommenders to query
    recommenders = [
        'google.compute.instance.MachineTypeRecommender',
        'google.compute.instance.IdleResourceRecommender',
        'google.compute.disk.IdleResourceRecommender',
        'google.compute.address.IdleResourceRecommender',
        'google.compute.image.IdleResourceRecommender',
        'google.iam.policy.Recommender',
        'google.compute.commitment.UsageCommitmentRecommender',
        'google.cloudsql.instance.IdleRecommender',
        'google.cloudsql.instance.OverprovisionedRecommender',
        'google.logging.productSuggestion.ContainerRecommender',
        'google.resourcemanager.projectUtilization.Recommender'
    ]
    
    all_recommendations = []
    
    for recommender_id in recommenders:
        parent = f"projects/{project_id}/locations/{location}/recommenders/{recommender_id}"
        
        try:
            recommendations = client.list_recommendations(parent=parent)
            
            for rec in recommendations:
                # Extract cost impact if available (savings show up as negative costs)
                cost_impact = 0
                if rec.primary_impact and rec.primary_impact.cost_projection:
                    cost = rec.primary_impact.cost_projection.cost
                    cost_impact = abs(cost.units + cost.nanos / 1e9)
                
                # content.overview is a protobuf Struct; convert before .get()
                overview = dict(rec.content.overview)
                
                all_recommendations.append({
                    'project_id': project_id,
                    'recommender': recommender_id.split('.')[-1],
                    'name': rec.name,
                    'description': rec.description,
                    'priority': rec.priority.name,
                    'state': rec.state.name,
                    'cost_impact_monthly': cost_impact,
                    'resource': overview.get('resource', ''),
                    'last_refresh': rec.last_refresh_time.strftime('%Y-%m-%d %H:%M:%S')
                })
        except Exception as e:
            print(f"Error getting recommendations for {recommender_id} in {project_id}: {str(e)}")
            continue
    
    return all_recommendations

def export_recommendations_to_csv(output_file='active-assist-recommendations.csv'):
    """Export all recommendations to CSV"""
    
    print("šŸ” Discovering projects...")
    projects = get_all_projects()
    print(f"Found {len(projects)} active projects")
    
    all_recommendations = []
    
    for project in projects:
        print(f"šŸ“Š Analyzing {project['project_id']}...")
        recommendations = get_recommendations(project['project_id'])
        all_recommendations.extend(recommendations)
        print(f"  Found {len(recommendations)} recommendations")
    
    # Sort by cost impact
    all_recommendations.sort(key=lambda x: x['cost_impact_monthly'], reverse=True)
    
    # Export to CSV
    if all_recommendations:
        with open(output_file, 'w', newline='') as f:
            writer = csv.DictWriter(f, fieldnames=all_recommendations[0].keys())
            writer.writeheader()
            writer.writerows(all_recommendations)
        
        print(f"\nāœ… Exported {len(all_recommendations)} recommendations to {output_file}")
        
        # Summary statistics
        total_savings = sum(r['cost_impact_monthly'] for r in all_recommendations)
        high_priority = len([r for r in all_recommendations if r['priority'] == 'P1'])
        
        print(f"\nšŸ’° Total monthly savings potential: ${total_savings:,.2f}")
        print(f"šŸ”„ High priority recommendations: {high_priority}")
        
        # Top 10 by savings
        print("\nšŸ“ˆ Top 10 recommendations by cost impact:")
        for i, rec in enumerate(all_recommendations[:10], 1):
            print(f"  {i}. [{rec['project_id']}] {rec['recommender']}: ${rec['cost_impact_monthly']:.2f}/month")
            print(f"     {rec['description'][:100]}...")
    else:
        print("āš ļø  No recommendations found")

if __name__ == '__main__':
    export_recommendations_to_csv()
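A note on the cost figures: the Recommender API reports cost projections as Money-style values, with whole currency units in `units` and a fractional part in `nanos`, and savings appear as negative costs. A minimal conversion helper (the name `money_to_float` is mine) under those assumptions:

```python
def money_to_float(units: int, nanos: int) -> float:
    """Combine a Money value's whole units and fractional nanos.

    Savings projections come back as negative costs, so wrap the
    result in abs() when you want a positive savings figure.
    """
    return units + nanos / 1e9

# -12 units with -500,000,000 nanos is -$12.50
print(money_to_float(-12, -500_000_000))  # → -12.5
```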

2. Auto-Apply Trusted Recommendations

#!/usr/bin/env python3
"""
Automatically apply trusted Active Assist recommendations
Currently handles: idle resource deletion, IAM permission removal
Requires approval workflow for production projects
"""

from google.cloud import recommender_v1
from google.cloud import compute_v1
import time

# Configuration
DRY_RUN = True  # Set to False to actually apply changes
PRODUCTION_PROJECTS = ['prod-project-1', 'prod-project-2']  # Require approval
AUTO_APPLY_RECOMMENDERS = [
    'google.compute.disk.IdleResourceRecommender',
    'google.compute.address.IdleResourceRecommender',
    'google.compute.image.IdleResourceRecommender'
]

def apply_idle_disk_recommendation(project_id, recommendation):
    """Delete idle disk"""
    
    # Parse name and zone out of the recommendation's full resource path
    overview = dict(recommendation.content.overview)
    parts = [p for p in overview.get('resource', '').split('/') if p]
    disk_name = parts[-1] if parts else ''
    zone = parts[parts.index('zones') + 1] if 'zones' in parts else ''
    
    if not disk_name or not zone:
        print(f"  āŒ Could not parse disk info from recommendation")
        return False
    
    print(f"  šŸ—‘ļø  Deleting idle disk: {disk_name} in {zone}")
    
    if DRY_RUN:
        print(f"  āš ļø  DRY RUN: Would delete disk {disk_name}")
        return True
    
    try:
        compute = compute_v1.DisksClient()
        operation = compute.delete(
            project=project_id,
            zone=zone,
            disk=disk_name
        )
        
        # Wait for operation to complete
        operation.result(timeout=120)
        print(f"  āœ… Disk {disk_name} deleted successfully")
        return True
        
    except Exception as e:
        print(f"  āŒ Error deleting disk: {str(e)}")
        return False

def apply_idle_address_recommendation(project_id, recommendation):
    """Release idle IP address"""
    
    overview = dict(recommendation.content.overview)
    parts = [p for p in overview.get('resource', '').split('/') if p]
    address_name = parts[-1] if parts else ''
    region = parts[parts.index('regions') + 1] if 'regions' in parts else ''
    
    if not address_name or not region:
        print(f"  āŒ Could not parse address info")
        return False
    
    print(f"  šŸ—‘ļø  Releasing idle address: {address_name} in {region}")
    
    if DRY_RUN:
        print(f"  āš ļø  DRY RUN: Would release address {address_name}")
        return True
    
    try:
        compute = compute_v1.AddressesClient()
        operation = compute.delete(
            project=project_id,
            region=region,
            address=address_name
        )
        
        operation.result(timeout=120)
        print(f"  āœ… Address {address_name} released successfully")
        return True
        
    except Exception as e:
        print(f"  āŒ Error releasing address: {str(e)}")
        return False

def mark_recommendation_complete(project_id, recommendation_name, recommender_id):
    """Mark recommendation as applied"""
    
    client = recommender_v1.RecommenderClient()
    
    # Fetch the recommendation to obtain its current etag; the etag is
    # not part of the resource name and cannot be parsed from it
    current = client.get_recommendation(name=recommendation_name)
    
    request = recommender_v1.MarkRecommendationSucceededRequest(
        name=recommendation_name,
        etag=current.etag
    )
    
    try:
        client.mark_recommendation_succeeded(request=request)
        print(f"  āœ… Marked recommendation as succeeded")
        return True
    except Exception as e:
        print(f"  āš ļø  Could not mark as succeeded: {str(e)}")
        return False

def process_recommendations(project_id):
    """Process all auto-apply recommendations for a project"""
    
    client = recommender_v1.RecommenderClient()
    
    print(f"\nšŸ“Š Processing project: {project_id}")
    
    # Check if production project
    is_production = project_id in PRODUCTION_PROJECTS
    if is_production:
        print(f"  āš ļø  PRODUCTION PROJECT - skipping auto-apply")
        return
    
    applied_count = 0
    
    for recommender_id in AUTO_APPLY_RECOMMENDERS:
        parent = f"projects/{project_id}/locations/global/recommenders/{recommender_id}"
        
        try:
            recommendations = client.list_recommendations(parent=parent)
            
            for rec in recommendations:
                if rec.state.name != 'ACTIVE':
                    continue
                
                print(f"\n  šŸ“‹ Recommendation: {rec.description}")
                print(f"     Priority: {rec.priority.name}")
                
                # Apply based on recommender type
                success = False
                if 'IdleResourceRecommender' in recommender_id:
                    if 'disk' in recommender_id.lower():
                        success = apply_idle_disk_recommendation(project_id, rec)
                    elif 'address' in recommender_id.lower():
                        success = apply_idle_address_recommendation(project_id, rec)
                
                if success:
                    mark_recommendation_complete(project_id, rec.name, recommender_id)
                    applied_count += 1
                
                time.sleep(2)  # Rate limiting
                
        except Exception as e:
            print(f"  āŒ Error processing {recommender_id}: {str(e)}")
            continue
    
    print(f"\nāœ… Applied {applied_count} recommendations for {project_id}")

if __name__ == '__main__':
    # Process all projects
    from google.cloud import resourcemanager_v3
    
    client = resourcemanager_v3.ProjectsClient()
    request = resourcemanager_v3.SearchProjectsRequest(query="state:ACTIVE")
    
    mode = "DRY RUN" if DRY_RUN else "LIVE"
    print(f"šŸš€ Starting auto-apply in {mode} mode\n")
    
    for project in client.search_projects(request=request):
        process_recommendations(project.project_id)
    
    print("\nāœ… Processing complete")
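The apply functions above pull the zone or region out of the resource path; slicing at a fixed index is brittle if the path format ever shifts. A more defensive parser sketch, assuming full-resource-name paths of the form `//compute.googleapis.com/projects/<p>/zones/<z>/disks/<d>` (the example path below is illustrative):

```python
def parse_compute_resource(resource: str):
    """Extract (name, location) from a Compute full resource name.

    Finds the 'zones' or 'regions' segment by value rather than by
    index, so leading '//' or extra prefix segments do not break it.
    """
    parts = [p for p in resource.split('/') if p]
    name = parts[-1] if parts else ''
    location = ''
    for key in ('zones', 'regions'):
        if key in parts:
            location = parts[parts.index(key) + 1]
            break
    return name, location

# Illustrative path, not taken from a live recommendation:
print(parse_compute_resource(
    "//compute.googleapis.com/projects/demo/zones/us-central1-a/disks/disk-1"
))  # → ('disk-1', 'us-central1-a')
```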

3. BigQuery Export and Trending Analysis

-- Set up BigQuery export (run once in Cloud Console)
-- Recommendation Hub > Settings > BigQuery Export > Enable

-- Query 1: Monthly cost savings opportunity trend
WITH monthly_recommendations AS (
  SELECT
    DATE_TRUNC(DATE(last_refresh_time), MONTH) as month,
    recommender_subtype,
    COUNT(*) as recommendation_count,
    -- Savings are exported as negative cost projections, hence ABS()
    SUM(ABS(primary_impact.cost_projection.cost.units)) as total_savings
  FROM `project.dataset.recommendations`
  WHERE state = 'ACTIVE'
    AND priority = 'P1'
  GROUP BY month, recommender_subtype
)
SELECT
  month,
  recommender_subtype,
  recommendation_count,
  ROUND(total_savings, 2) as monthly_savings_usd
FROM monthly_recommendations
ORDER BY month DESC, total_savings DESC;

-- Query 2: Recommendations by project with savings potential
SELECT
  cloud_entity_id as project_id,
  recommender_subtype,
  COUNT(*) as recommendation_count,
  ROUND(SUM(ABS(primary_impact.cost_projection.cost.units)), 2) as potential_monthly_savings
FROM `project.dataset.recommendations`
WHERE state = 'ACTIVE'
  AND last_refresh_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY project_id, recommender_subtype
HAVING potential_monthly_savings > 0
ORDER BY potential_monthly_savings DESC
LIMIT 20;

-- Query 3: Recommendation resolution rate
SELECT
  recommender_subtype,
  COUNT(*) as total_recommendations,
  COUNTIF(state = 'SUCCEEDED') as resolved,
  COUNTIF(state = 'DISMISSED') as dismissed,
  COUNTIF(state = 'ACTIVE') as still_active,
  ROUND(COUNTIF(state = 'SUCCEEDED') / COUNT(*) * 100, 1) as resolution_rate_pct
FROM `project.dataset.recommendations`
WHERE last_refresh_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY recommender_subtype
ORDER BY total_recommendations DESC;

-- Query 4: Unattended projects (security risk)
SELECT
  cloud_entity_id as project_id,
  description,
  last_refresh_time,
  JSON_EXTRACT_SCALAR(recommendation_details, '$.overview.lastActivityTime') as last_activity
FROM `project.dataset.recommendations`
WHERE recommender_subtype = 'UNATTENDED_PROJECT_RECOMMENDER'
  AND state = 'ACTIVE'
ORDER BY last_refresh_time DESC;

-- Query 5: IAM over-permissions by project
SELECT
  cloud_entity_id as project_id,
  COUNT(*) as overpermission_count,
  STRING_AGG(DISTINCT JSON_EXTRACT_SCALAR(recommendation_details, '$.overview.member'), ', ') as affected_members
FROM `project.dataset.recommendations`
WHERE recommender = 'google.iam.policy.Recommender'
  AND state = 'ACTIVE'
  AND priority = 'P1'
GROUP BY project_id
ORDER BY overpermission_count DESC
LIMIT 20;
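Once the export is flowing, the queries above can also be run from code on a schedule. A sketch using the BigQuery client library, with `project.dataset.recommendations` standing in for your export table and the SQL kept in a separate builder function so it stays testable:

```python
def build_savings_query(table: str) -> str:
    """Per-project savings query against the Recommender BigQuery export."""
    return f"""
        SELECT
          cloud_entity_id AS project_id,
          SUM(ABS(primary_impact.cost_projection.cost.units)) AS monthly_savings
        FROM `{table}`
        WHERE state = 'ACTIVE'
        GROUP BY project_id
        ORDER BY monthly_savings DESC
    """

def run_savings_report(table: str = "project.dataset.recommendations"):
    # Deferred import so the query builder works without the client installed
    from google.cloud import bigquery
    client = bigquery.Client()
    for row in client.query(build_savings_query(table)).result():
        print(f"{row.project_id}: ${row.monthly_savings:,.0f}/month")
```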

4. Slack Alert Integration

#!/usr/bin/env python3
"""
Send daily Active Assist summary to Slack
Alerts on high-priority recommendations and new cost-saving opportunities
"""

import requests
import json
from google.cloud import recommender_v1
from datetime import datetime, timedelta

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"
ALERT_THRESHOLD = 100  # Alert if potential monthly savings > $100

def get_high_priority_recommendations(project_id):
    """Get all P1 recommendations for a project"""
    
    client = recommender_v1.RecommenderClient()
    
    recommenders = [
        'google.compute.instance.MachineTypeRecommender',
        'google.compute.instance.IdleResourceRecommender',
        'google.iam.policy.Recommender',
        'google.resourcemanager.projectUtilization.Recommender'
    ]
    
    high_priority = []
    
    for recommender_id in recommenders:
        parent = f"projects/{project_id}/locations/global/recommenders/{recommender_id}"
        
        try:
            recommendations = client.list_recommendations(parent=parent)
            
            for rec in recommendations:
                if rec.priority.name == 'P1' and rec.state.name == 'ACTIVE':
                    cost_impact = 0
                    if rec.primary_impact and rec.primary_impact.cost_projection:
                        cost = rec.primary_impact.cost_projection.cost
                        cost_impact = abs(cost.units + cost.nanos / 1e9)
                    
                    high_priority.append({
                        'project': project_id,
                        'type': recommender_id.split('.')[-1],
                        'description': rec.description,
                        'savings': cost_impact
                    })
        except Exception as e:
            print(f"  āš ļø  Skipping {recommender_id} in {project_id}: {str(e)}")
            continue
    
    return high_priority

def send_slack_alert(recommendations):
    """Send formatted Slack message"""
    
    if not recommendations:
        print("No high-priority recommendations to alert on")
        return
    
    total_savings = sum(r['savings'] for r in recommendations)
    
    # Build Slack message
    message = {
        "blocks": [
            {
                "type": "header",
                "text": {
                    "type": "plain_text",
                    "text": "šŸŽÆ Active Assist Daily Summary"
                }
            },
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": f"*{len(recommendations)}* high-priority recommendations\n*${total_savings:,.2f}* monthly savings potential"
                }
            },
            {
                "type": "divider"
            }
        ]
    }
    
    # Add top 5 recommendations
    for rec in sorted(recommendations, key=lambda x: x['savings'], reverse=True)[:5]:
        message["blocks"].append({
            "type": "section",
            "text": {
                "type": "mrkdwn",
                "text": f"*{rec['project']}* - {rec['type']}\n{rec['description']}\nšŸ’° Save ${rec['savings']:.2f}/month"
            }
        })
    
    # Send to Slack
    response = requests.post(
        SLACK_WEBHOOK_URL,
        data=json.dumps(message),
        headers={'Content-Type': 'application/json'}
    )
    
    if response.status_code == 200:
        print("āœ… Slack alert sent successfully")
    else:
        print(f"āŒ Slack alert failed: {response.status_code}")

if __name__ == '__main__':
    # Get recommendations across all projects
    from google.cloud import resourcemanager_v3
    
    client = resourcemanager_v3.ProjectsClient()
    request = resourcemanager_v3.SearchProjectsRequest(query="state:ACTIVE")
    
    all_recommendations = []
    
    for project in client.search_projects(request=request):
        recs = get_high_priority_recommendations(project.project_id)
        all_recommendations.extend(recs)
    
    # Only alert if total savings exceed threshold
    total_savings = sum(r['savings'] for r in all_recommendations)
    
    if total_savings >= ALERT_THRESHOLD:
        send_slack_alert(all_recommendations)
    else:
        print(f"No alert: Total savings (${total_savings:.2f}) below threshold (${ALERT_THRESHOLD})")
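Slack caps a single message at 50 blocks, which is why the script posts only the top five items plus a summary. Pulling the headline text into a small pure helper (the name is mine) keeps that formatting testable without posting anything:

```python
def summary_line(recommendations) -> str:
    """Headline text for the Slack summary block."""
    total = sum(r['savings'] for r in recommendations)
    return (f"*{len(recommendations)}* high-priority recommendations\n"
            f"*${total:,.2f}* monthly savings potential")

print(summary_line([{'savings': 120.0}, {'savings': 80.5}]))
```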

5. gcloud Command Line Operations

#!/bin/bash
# Active Assist operations via gcloud CLI

PROJECT_ID="your-project-id"
LOCATION="global"

# List all recommenders available
echo "šŸ“‹ Available recommenders:"
gcloud recommender recommenders list --location=$LOCATION

# Get idle VM recommendations
echo -e "\nšŸ’¤ Idle VM recommendations:"
gcloud recommender recommendations list \
    --project=$PROJECT_ID \
    --location=$LOCATION \
    --recommender=google.compute.instance.IdleResourceRecommender \
    --format="table(name.basename(),description,primaryImpact.costProjection.cost.units)"

# Get VM rightsizing recommendations
echo -e "\nšŸ“Š VM rightsizing recommendations:"
gcloud recommender recommendations list \
    --project=$PROJECT_ID \
    --location=$LOCATION \
    --recommender=google.compute.instance.MachineTypeRecommender \
    --format="table(name.basename(),description,primaryImpact.costProjection.cost.units)"

# Get IAM recommendations (remove over-permissions)
echo -e "\nšŸ” IAM over-permission recommendations:"
gcloud recommender recommendations list \
    --project=$PROJECT_ID \
    --location=$LOCATION \
    --recommender=google.iam.policy.Recommender \
    --format="table(name.basename(),description,priority)"

# Mark recommendation as succeeded (after manual remediation)
# RECOMMENDATION_ID="projects/123/locations/global/recommenders/google.compute.instance.IdleResourceRecommender/recommendations/abc123"
# gcloud recommender recommendations mark-succeeded $RECOMMENDATION_ID \
#     --location=$LOCATION \
#     --etag=abc123

# Dismiss recommendation
# gcloud recommender recommendations mark-dismissed $RECOMMENDATION_ID \
#     --location=$LOCATION

# Export to JSON for processing
gcloud recommender recommendations list \
    --project=$PROJECT_ID \
    --location=$LOCATION \
    --recommender=google.compute.instance.IdleResourceRecommender \
    --format=json > idle-vms.json

echo -e "\nāœ… Recommendations exported to idle-vms.json"
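The `idle-vms.json` export can then be post-processed offline. A sketch that totals the projected savings, assuming the camelCase field names gcloud emits in JSON output (`primaryImpact.costProjection.cost.units`, where `units` may arrive as a string):

```python
import json

def total_idle_savings(path: str) -> float:
    """Sum absolute cost-projection units across exported recommendations."""
    with open(path) as f:
        recs = json.load(f)
    total = 0.0
    for rec in recs:
        cost = (rec.get('primaryImpact', {})
                   .get('costProjection', {})
                   .get('cost', {}))
        total += abs(float(cost.get('units', 0)))
    return total
```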

Why It Matters

  • Zero configuration: Active Assist is already running and analyzing your environment
  • ML-powered: Google’s machine learning analyzes usage patterns over time, not just point-in-time snapshots
  • Free for all recommendations: unlike AWS Trusted Advisor, whose full check set requires a Business or Enterprise support plan, or paid monitoring tools
  • Granular resource-level: Recommendations target specific VMs, disks, IPs – not just generic advice
  • API-first: Full API access enables automation and integration with existing workflows
  • Multi-dimensional: Cost, security, performance, reliability, sustainability in one place

Try This Week

  1. Export all recommendations – Run the Python export script to see everything Active Assist has found
  2. Query via gcloud – Use the bash script to check idle VMs and rightsizing opportunities
  3. Enable BigQuery export – Turn on export in Recommendation Hub settings for trending
  4. Calculate ROI – Sort by cost_impact_monthly to see your top 10 savings opportunities
  5. Apply 5 recommendations – Manually apply your safest wins (idle disks, unused IPs)

Common Active Assist Mistakes

  • Not enabling BigQuery export: You lose historical trending and can’t build dashboards
  • Ignoring IAM recommendations: Over-permissioned accounts are security incidents waiting to happen
  • Dismissing without investigation: If you dismiss, document why – you’ll forget in 6 months
  • Only looking at cost: Security recommendations often have bigger business impact than savings
  • Not automating trusted recommendations: Idle resources can be safely auto-deleted
  • Forgetting to check unattended projects: Abandoned projects are attack vectors

Advanced Patterns

Organization-wide dashboarding: Export recommendations from all projects to BigQuery, then build Looker Studio (formerly Data Studio) dashboards

Automated approval workflows: Route high-value recommendations through ServiceNow/Jira for approval before auto-apply

Policy enforcement: Use Organization Policy to require remediation of P1 recommendations within 30 days

Cost allocation: Tag resources with recommendation type when applying to track which recommendations drive most savings

Multi-cloud comparison: Compare Active Assist findings with AWS Trusted Advisor and Azure Advisor for consistency

Pro Tip

Start by exporting all recommendations to CSV and sorting by cost_impact_monthly. The top 20 recommendations usually represent 80% of your total savings opportunity. Focus there first – you’ll get quick wins that demonstrate value and build momentum for broader adoption. Idle VMs typically show $50-200/month each, while commitment recommendations can be $5,000-20,000/month.
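That 80/20 claim is easy to check against your own export with a throwaway helper (the function name and the savings figures below are illustrative):

```python
def top_n_share(savings, n=20):
    """Fraction of total savings captured by the n largest opportunities."""
    ranked = sorted(savings, reverse=True)
    total = sum(ranked)
    return sum(ranked[:n]) / total if total else 0.0

# Illustrative figures: the top 2 of these 6 carry 80% of the savings
print(top_n_share([500, 300, 100, 50, 25, 25], n=2))  # → 0.8
```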