add sync functionality for API keys and update README and schema

2025-08-20 18:10:44 +05:30
parent e917de2718
commit 7e6b34b9e8
3 changed files with 136 additions and 24 deletions


@@ -63,11 +63,11 @@ This script automates the creation and deletion of Gemini API keys across all Go
## Usage
The script has three main actions: `create`, `delete`, and `sync`.
### Creating API Keys
To create a new "Gemini API Key" in each project that doesn't already have one:
```bash
uv run main.py create
```
@@ -91,26 +91,46 @@ uv run main.py delete --email user1@example.com
**Note**: The `--email` argument is required for the `delete` action for safety.
### Synchronizing API Keys
To synchronize the local database with the state of keys in Google Cloud for all users in `emails.txt`:
```bash
uv run main.py sync
```
You can also run it for a single email:
```bash
uv run main.py sync --email user1@example.com
```
The `sync` action will:
- Add any keys that exist in the cloud but not locally to the database.
- Mark any keys that exist locally but not in the cloud as `INACTIVE`.
- Report any keys that are correctly synchronized.
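The reconciliation behind `sync` is plain set arithmetic over key IDs. A minimal sketch (the `reconcile` helper is hypothetical; in the script the local records live under each project's `api_keys` list rather than a flat dict):

```python
from datetime import datetime, timezone

def reconcile(cloud_keys, local_keys):
    """Partition key UIDs into synced / cloud-only / local-only buckets.

    cloud_keys: dict of uid -> key object fetched from Google Cloud
    local_keys: dict of uid -> mutable key record from the local database
    """
    cloud_uids, local_uids = set(cloud_keys), set(local_keys)
    synced = cloud_uids & local_uids       # reported as already in sync
    cloud_only = cloud_uids - local_uids   # would be added to the database
    local_only = local_uids - cloud_uids   # marked INACTIVE below
    for uid in local_only:
        local_keys[uid]["state"] = "INACTIVE"
        local_keys[uid]["last_updated_timestamp_utc"] = datetime.now(timezone.utc).isoformat()
    return synced, cloud_only, local_only
```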
### Dry Run
To see what the script *would* do without making any actual changes to your Google Cloud resources, use the `--dry-run` flag with any action.
```bash
uv run main.py create --dry-run
uv run main.py sync --email user1@example.com --dry-run
uv run main.py delete --email user1@example.com --dry-run
```
## Output
- **Logs**: A detailed log file is created in the `logs/` directory for each run, named with a UTC timestamp (e.g., `gemini_key_management_2023-10-27T12-30-00.log`).
- **Database**: The `api_keys_database.json` file is created or updated after each successful run. This file contains a structured record of the accounts processed, the projects found, and the API keys managed by the script.
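For orientation, the database takes roughly the following shape. The values are illustrative and the exact layout is defined by the bundled JSON schema; the `projects`, `project_info`, `api_keys`, `state`, and `key_details.key_id` names are taken from the sync code.

```json
{
  "accounts": [
    {
      "email": "user1@example.com",
      "projects": [
        {
          "project_info": {
            "project_id": "my-project-123",
            "project_name": "My Project",
            "project_number": "123456789012",
            "state": "ACTIVE"
          },
          "api_keys": [
            {
              "state": "ACTIVE",
              "key_details": {
                "key_id": "example-key-uid",
                "last_updated_timestamp_utc": "2025-08-20T12:40:44+00:00"
              }
            }
          ]
        }
      ]
    }
  ]
}
```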
## How it Works
1. **Authentication**: For each email, the script looks for a corresponding `[email].json` token file in the `credentials/` directory. If found and valid, it uses it. If not, it initiates the OAuth 2.0 flow.
2. **Project Discovery**: It uses the Google Cloud Resource Manager API to find all projects the authenticated user has access to.
3. **API Enablement**: For each project, it checks if the "Generative Language API" (`generativelanguage.googleapis.com`) is enabled. If not, it attempts to enable it (during `create` actions).
4. **Key Management Actions**:
   - **Create**: It checks if a key named "Gemini API Key" already exists. If not, it creates a new key using the API Keys API. The key is restricted to only be able to call the `generativelanguage.googleapis.com` service.
   - **Delete**: It finds all keys with the display name "Gemini API Key" and deletes them.
   - **Sync**: It compares the keys present in each cloud project with the keys listed in the local `api_keys_database.json`. It adds cloud-only keys to the local database, and marks local-only keys as `INACTIVE`.
5. **Database Update**: The script records the details of any created or synced keys in the `api_keys_database.json` file. When keys are deleted, they are removed from this database. During a sync, keys that no longer exist in the cloud are marked as `INACTIVE`.

main.py

@@ -160,6 +160,8 @@ def main():
    parser.add_argument("--dry-run", action="store_true", help="Simulate the run without making any actual changes to Google Cloud resources.")
    args = parser.parse_args()

    logging.info(f"Program arguments: {vars(args)}")

    if args.action == 'delete' and not args.email:
        parser.error("the --email argument is required for the 'delete' action")
@@ -192,13 +194,116 @@ def main():
    if not args.dry_run:
        save_keys_to_json(api_keys_data, API_KEYS_DATABASE_FILE, schema)
def sync_project_keys(project, creds, dry_run, db_lock, account_entry):
    """Synchronizes API keys between Google Cloud and the local database for a single project.
    Returns True if a Gemini API key exists in the project, False otherwise."""
    # Helper class to create a mock key object compatible with add_key_to_database
    class TempKey:
        def __init__(self, cloud_key, key_string):
            self.key_string = key_string
            self.uid = cloud_key.uid
            self.name = cloud_key.name
            self.display_name = cloud_key.display_name
            self.create_time = cloud_key.create_time
            self.update_time = cloud_key.update_time
            self.restrictions = cloud_key.restrictions

    project_id = project.project_id
    logging.info(f"  Synchronizing keys for project {project_id}")
    gemini_key_exists = False
    try:
        api_keys_client = api_keys_v2.ApiKeysClient(credentials=creds)
        parent = f"projects/{project_id}/locations/global"
        # 1. Fetch cloud keys
        cloud_keys_list = list(api_keys_client.list_keys(parent=parent))
        for key in cloud_keys_list:
            if key.display_name in ["Gemini API Key", "Generative Language API Key"]:
                gemini_key_exists = True
        cloud_keys = {key.uid: key for key in cloud_keys_list}
        # 2. Fetch local keys
        project_entry = next((p for p in account_entry["projects"] if p.get("project_info", {}).get("project_id") == project_id), None)
        if not project_entry:
            # If project is not in DB, create it.
            project_entry = {
                "project_info": {
                    "project_id": project.project_id,
                    "project_name": project.display_name,
                    "project_number": project.name.split('/')[-1],
                    "state": str(project.state)
                },
                "api_keys": []
            }
            with db_lock:
                account_entry["projects"].append(project_entry)
        local_keys = {key['key_details']['key_id']: key for key in project_entry.get('api_keys', [])}
        # 3. Reconcile
        cloud_uids = set(cloud_keys.keys())
        local_uids = set(local_keys.keys())
        synced_uids = cloud_uids.intersection(local_uids)
        cloud_only_uids = cloud_uids - local_uids
        local_only_uids = local_uids - cloud_uids
        # 4. Process
        for uid in synced_uids:
            logging.info(f"  Key {uid} is synchronized.")
        for uid in cloud_only_uids:
            key_object = cloud_keys[uid]
            logging.info(f"  Key {uid} ({key_object.display_name}) found in cloud only. Adding to local database.")
            if dry_run:
                logging.info(f"  [DRY RUN] Would fetch key string for {uid} and add to database.")
                continue
            try:
                # The Key object from list_keys doesn't have key_string, so we fetch it.
                key_string_response = api_keys_client.get_key_string(name=key_object.name)
                hydrated_key = TempKey(key_object, key_string_response.key_string)
                with db_lock:
                    add_key_to_database(account_entry, project, hydrated_key)
            except google_exceptions.PermissionDenied:
                logging.warning(f"  Permission denied to get key string for {uid}. Skipping.")
            except google_exceptions.GoogleAPICallError as err:
                logging.error(f"  Error getting key string for {uid}: {err}")
        for uid in local_only_uids:
            logging.info(f"  Key {uid} found in local database only. Marking as INACTIVE.")
            if dry_run:
                logging.info(f"  [DRY RUN] Would mark key {uid} as INACTIVE.")
                continue
            with db_lock:
                local_keys[uid]['state'] = 'INACTIVE'
                local_keys[uid]['key_details']['last_updated_timestamp_utc'] = datetime.now(timezone.utc).isoformat()
        return gemini_key_exists
    except google_exceptions.PermissionDenied:
        logging.warning(f"  Permission denied to list keys for project {project_id}. Skipping sync.")
        return False
    except google_exceptions.GoogleAPICallError as err:
        logging.error(f"  An API error occurred while syncing keys for project {project_id}: {err}")
        return False
def process_project_for_action(project, creds, action, dry_run, db_lock, account_entry):
    """Processes a single project for the given action in a thread-safe manner."""
    project_id = project.project_id
    logging.info(f"- Starting to process project: {project_id} ({project.display_name})")

    if action == 'create':
        gemini_key_exists = sync_project_keys(project, creds, dry_run, db_lock, account_entry)
        if gemini_key_exists:
            logging.info(f"  'Gemini API Key' already exists in project {project_id}. Skipping creation.")
            return
@@ -330,20 +435,6 @@ def remove_keys_from_database(account_entry, project_id, deleted_keys_uids):
    if num_removed > 0:
        logging.info(f"  Removed {num_removed} key(s) from local database for project {project_id}")
def project_has_gemini_key(project_id, credentials):
    """Checks if a project already has a key named 'Gemini API Key'."""
    try:
        api_keys_client = api_keys_v2.ApiKeysClient(credentials=credentials)
        parent = f"projects/{project_id}/locations/global"
        keys = api_keys_client.list_keys(parent=parent)
        for key in keys:
            if key.display_name in ["Gemini API Key", "Generative Language API Key"]:  # 2nd name is used when the key was created via AI Studio
                return True
        return False
    except google_exceptions.GoogleAPICallError as err:
        logging.error(f"  Could not list keys in project {project_id}. Error: {err}")
        return False
def get_credentials_for_email(email):
    """Handles the OAuth2 flow for a given email."""
    token_file = os.path.join(CREDENTIALS_DIR, f"{email}.json")


@@ -136,7 +136,8 @@
      }
    },
    "state": {
      "type": "string",
      "enum": ["ACTIVE", "INACTIVE"]
    }
  }
}