
ArcGIS Dependency Mapper Skill

Purpose

Crawl and document relationships between AGOL items - which web maps use which layers, which apps use which maps, what breaks if you delete something.

When to Use

  • Before deleting or modifying any shared item
  • Auditing content organization
  • Planning migrations or reorganizations
  • Understanding impact of schema changes
  • Documenting project architectures

Prerequisites

  • Authenticated GIS connection (see arcgis-authentication skill)
  • Read access to items you want to analyze

The Problem This Solves

AGOL doesn't show you:

  • "If I delete this Feature Layer, what breaks?"
  • "Which apps are using this web map?"
  • "What's the full dependency tree for this dashboard?"

This skill builds that picture.

Core Dependency Mapping

Map Layer Dependencies


def get_webmap_layers(gis, webmap_item):
    """Extract all layer references from a web map."""
    if webmap_item.type != "Web Map":
        return []
    
    # Get the web map JSON
    webmap_data = webmap_item.get_data()
    
    layers = []
    
    # Operational layers
    for layer in webmap_data.get("operationalLayers", []):
        layers.append({
            "title": layer.get("title"),
            "url": layer.get("url"),
            "itemId": layer.get("itemId"),
            "type": "operational"
        })
    
    # Basemap layers
    basemap = webmap_data.get("baseMap", {})
    for layer in basemap.get("baseMapLayers", []):
        layers.append({
            "title": layer.get("title"),
            "url": layer.get("url"),
            "itemId": layer.get("itemId"),
            "type": "basemap"
        })
    
    # Tables
    for table in webmap_data.get("tables", []):
        layers.append({
            "title": table.get("title"),
            "url": table.get("url"),
            "itemId": table.get("itemId"),
            "type": "table"
        })
    
    return layers
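A quick offline sketch of what the parser pulls out. The `webmap_json` dict below is a hand-written stand-in for what `item.get_data()` returns; every title, URL, and ID is invented for illustration:

```python
# Hand-written stand-in for a web map's JSON (all values invented);
# the loop mirrors what get_webmap_layers does for each section.
webmap_json = {
    "operationalLayers": [
        {"title": "Shelters", "url": "https://services.example.com/FeatureServer/0", "itemId": "aaa111"},
    ],
    "baseMap": {"baseMapLayers": [{"title": "Topographic", "url": None, "itemId": None}]},
    "tables": [{"title": "Assignments", "url": None, "itemId": "bbb222"}],
}

layers = []
for kind, entries in [
    ("operational", webmap_json.get("operationalLayers", [])),
    ("basemap", webmap_json.get("baseMap", {}).get("baseMapLayers", [])),
    ("table", webmap_json.get("tables", [])),
]:
    for lyr in entries:
        layers.append({"title": lyr.get("title"), "itemId": lyr.get("itemId"), "type": kind})

print([l["type"] for l in layers])  # ['operational', 'basemap', 'table']
```

Note that basemap layers often carry no itemId (they reference built-in basemaps by URL), which is why the real function keeps url alongside itemId.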

App Dependencies

def get_app_dependencies(gis, app_item):
    """Extract dependencies from web apps (Dashboard, Experience Builder, etc.)."""
    dependencies = []
    
    try:
        app_data = app_item.get_data()
    except Exception:
        # Item has no JSON payload, or it could not be parsed
        return dependencies
    
    # Dashboard, Experience Builder, and classic Web AppBuilder apps each
    # use a different JSON layout, but all of them embed their references
    # as itemId / webmap keys somewhere in that JSON, so one recursive
    # scan (find_item_references, below) covers every app type.
    if isinstance(app_data, dict):
        dependencies = find_item_references(app_data)
    
    return dependencies

def find_item_references(obj, path=""):
    """Recursively find itemId and url references in JSON."""
    refs = []
    
    if isinstance(obj, dict):
        if "itemId" in obj:
            refs.append({
                "itemId": obj["itemId"],
                "path": path,
                "url": obj.get("url")
            })
        if "webmap" in obj:
            refs.append({
                "itemId": obj["webmap"],
                "path": path + ".webmap",
                "type": "webmap"
            })
        
        for key, value in obj.items():
            refs.extend(find_item_references(value, f"{path}.{key}"))
    
    elif isinstance(obj, list):
        for i, item in enumerate(obj):
            refs.extend(find_item_references(item, f"{path}[{i}]"))
    
    return refs
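To see the traversal in action without a live portal, here is a compact copy of the same scan run over an invented Dashboard-like fragment (real Dashboard JSON is nested far deeper, but uses the same itemId keys):

```python
# Compact copy of the recursive scan, repeated so this snippet runs
# standalone; the dashboard fragment and its IDs are invented.
def scan(obj, path=""):
    refs = []
    if isinstance(obj, dict):
        if "itemId" in obj:
            refs.append((path, obj["itemId"]))
        for key, value in obj.items():
            refs.extend(scan(value, f"{path}.{key}"))
    elif isinstance(obj, list):
        for i, entry in enumerate(obj):
            refs.extend(scan(entry, f"{path}[{i}]"))
    return refs

dashboard_json = {
    "widgets": [
        {"type": "mapWidget", "itemId": "map001"},
        {"type": "listWidget", "datasets": [{"itemId": "lyr002"}]},
    ]
}

for path, item_id in scan(dashboard_json):
    print(path, item_id)
# .widgets[0] map001
# .widgets[1].datasets[0] lyr002
```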

Build Full Dependency Graph

def build_dependency_graph(gis, root_item_id, max_depth=5):
    """
    Build complete dependency graph starting from an item.
    
    Returns dict with:
    - nodes: All items in the graph
    - edges: Dependencies between items
    """
    visited = set()
    nodes = {}
    edges = []
    
    def crawl(item_id, depth=0):
        if depth > max_depth or item_id in visited:
            return
        
        visited.add(item_id)
        item = gis.content.get(item_id)
        
        if item is None:
            nodes[item_id] = {"id": item_id, "title": "NOT FOUND", "type": "missing"}
            return
        
        nodes[item_id] = {
            "id": item_id,
            "title": item.title,
            "type": item.type,
            "owner": item.owner
        }
        
        # Get dependencies based on item type
        deps = []
        if item.type == "Web Map":
            deps = get_webmap_layers(gis, item)
        elif item.type in ["Dashboard", "Web Experience", "Web Mapping Application"]:
            deps = get_app_dependencies(gis, item)
        
        for dep in deps:
            dep_id = dep.get("itemId")
            if dep_id:
                edges.append({
                    "from": item_id,
                    "to": dep_id,
                    "relationship": dep.get("type", "uses")
                })
                crawl(dep_id, depth + 1)
    
    crawl(root_item_id)
    
    return {"nodes": nodes, "edges": edges}

Reverse Dependency Lookup

def find_dependents(gis, item_id, owner=None):
    """
    Find all items that depend on a given item.
    
    This is the "what breaks if I delete this?" query.
    """
    target_item = gis.content.get(item_id)
    if not target_item:
        return []
    
    # Search for items that might reference this one
    query = f"owner:{owner}" if owner else "owner:me"
    candidates = gis.content.search(
        query=query,
        item_type="Web Map",
        max_items=500
    )
    
    # Add apps
    for app_type in ["Dashboard", "Web Experience", "Web Mapping Application"]:
        apps = gis.content.search(query=query, item_type=app_type, max_items=200)
        candidates.extend(apps)
    
    dependents = []
    
    for candidate in candidates:
        if candidate.id == item_id:
            continue
        
        # Check if this candidate references our target
        deps = []
        if candidate.type == "Web Map":
            deps = get_webmap_layers(gis, candidate)
        else:
            deps = get_app_dependencies(gis, candidate)
        
        for dep in deps:
            if dep.get("itemId") == item_id:
                dependents.append({
                    "id": candidate.id,
                    "title": candidate.title,
                    "type": candidate.type,
                    "relationship": dep.get("type", "uses")
                })
                break
    
    return dependents

Impact Analysis

def analyze_deletion_impact(gis, item_id):
    """
    Full impact analysis for deleting an item.
    """
    item = gis.content.get(item_id)
    if not item:
        return {"error": "Item not found"}
    
    print(f"\n=== Deletion Impact Analysis ===")
    print(f"Item: {item.title} ({item.type})")
    print(f"ID: {item_id}")
    print(f"Owner: {item.owner}")
    
    # Find dependents
    dependents = find_dependents(gis, item_id)
    
    if not dependents:
        print("\n✓ No dependents found - safe to delete")
    else:
        print(f"\n⚠ WARNING: {len(dependents)} items depend on this!")
        print("\nDependents that will break:")
        for dep in dependents:
            print(f"  - {dep['title']} ({dep['type']})")
    
    # If it's a Feature Service, also surface related items (hosted views,
    # source data). Relationship direction semantics vary by relationship
    # type, so verify against your org if results look incomplete.
    if item.type == "Feature Service":
        related = item.related_items("Service2Data", "forward")
        related.extend(item.related_items("Service2Service", "forward"))
        if related:
            print(f"\n⚠ Related items (views, etc.):")
            for rel in related:
                print(f"  - {rel.title} ({rel.type})")
    
    return {
        "item": {"id": item_id, "title": item.title, "type": item.type},
        "dependents": dependents,
        "safe_to_delete": len(dependents) == 0
    }

Visualization

def export_graph_to_mermaid(graph):
    """Export dependency graph as Mermaid diagram."""
    lines = ["graph TD"]
    
    # Add nodes
    for node_id, node in graph["nodes"].items():
        safe_id = node_id.replace("-", "_")
        # Escape double quotes, which would break Mermaid's ["..."] syntax
        label = f"{node['title']}<br/>{node['type']}".replace('"', "'")
        lines.append(f'    {safe_id}["{label}"]')
    
    # Add edges
    for edge in graph["edges"]:
        from_id = edge["from"].replace("-", "_")
        to_id = edge["to"].replace("-", "_")
        lines.append(f"    {from_id} --> {to_id}")
    
    return "\n".join(lines)

def export_graph_to_csv(graph):
    """Export as CSV for analysis."""
    import csv
    from io import StringIO
    
    output = StringIO()
    writer = csv.writer(output)
    
    # Nodes
    writer.writerow(["node_id", "title", "type", "owner"])
    for node_id, node in graph["nodes"].items():
        writer.writerow([node_id, node["title"], node["type"], node.get("owner", "")])
    
    writer.writerow([])  # Blank row
    
    # Edges
    writer.writerow(["from", "to", "relationship"])
    for edge in graph["edges"]:
        writer.writerow([edge["from"], edge["to"], edge["relationship"]])
    
    return output.getvalue()

Full Content Audit

def full_org_dependency_audit(gis, owner=None):
    """
    Complete dependency audit for all content.
    Returns orphaned layers, circular dependencies, etc.
    """
    query = f"owner:{owner}" if owner else "owner:me"
    all_items = gis.content.search(query=query, max_items=1000)
    
    # Categorize items
    layers = [i for i in all_items if i.type in ["Feature Service", "Feature Layer", "Map Service"]]
    maps = [i for i in all_items if i.type == "Web Map"]
    apps = [i for i in all_items if i.type in ["Dashboard", "Web Experience", "Web Mapping Application"]]
    
    # Find referenced layers
    referenced_layers = set()
    for wm in maps:
        for layer in get_webmap_layers(gis, wm):
            if layer.get("itemId"):
                referenced_layers.add(layer["itemId"])
    
    for app in apps:
        for dep in get_app_dependencies(gis, app):
            if dep.get("itemId"):
                referenced_layers.add(dep["itemId"])
    
    # Find orphans (layers not used by any map or app)
    orphans = [l for l in layers if l.id not in referenced_layers]
    
    print(f"\n=== Content Audit ===")
    print(f"Total items: {len(all_items)}")
    print(f"Feature Layers/Services: {len(layers)}")
    print(f"Web Maps: {len(maps)}")
    print(f"Apps: {len(apps)}")
    print(f"\nOrphaned layers (not used by any map/app): {len(orphans)}")
    
    for orphan in orphans[:10]:  # Show first 10
        print(f"  - {orphan.title} ({orphan.id})")
    
    if len(orphans) > 10:
        print(f"  ... and {len(orphans) - 10} more")
    
    return {
        "total": len(all_items),
        "layers": len(layers),
        "maps": len(maps),
        "apps": len(apps),
        "orphans": [{"id": o.id, "title": o.title} for o in orphans]
    }

Gotchas & Limitations

  1. Private items: Can only analyze items you have access to
  2. External references: Won't catch layers hosted in other orgs
  3. Performance: Large orgs take time to crawl, and search() results are capped by max_items - page through results and add progress indicators
  4. App JSON complexity: Different app types have different JSON structures
  5. Dynamic references: Some apps build layer URLs dynamically - won't be caught
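For gotcha 3, a minimal progress pattern for the candidate loop in find_dependents (plain prints, so no extra dependency; tqdm is a common alternative). The `candidates` list here is an invented stand-in for the search results:

```python
# Print progress every 50 candidates; in find_dependents, `candidates`
# would be the combined search results rather than this placeholder list.
candidates = [f"item-{i}" for i in range(230)]  # invented placeholder items

checked = 0
for candidate in candidates:
    checked += 1
    if checked % 50 == 0 or checked == len(candidates):
        print(f"  checked {checked}/{len(candidates)} candidates")
```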

Red Cross Specific Notes

  • Org URL: https://arc-nhq-gis.maps.arcgis.com
  • Run dependency check before ANY delete during disaster ops
  • Common orphans: test layers, one-off imports, old project data
  • Document dependencies for critical layers (shelters, damage assessment)

Related Skills

  • arcgis-authentication (required)
  • arcgis-content-ops (for item retrieval)



Metadata

  • License: unknown
  • Version: -
  • Updated: 2/8/2026
  • Publisher: franzenjb

Tags

  • ci-cd
  • testing