Here’s a ready-to-paste code drop for Phase A + B1.
- For existing files I show only the delta you need to add.
- For new files I provide the full content.
- Comments at the top of each snippet briefly state the purpose.
Note: I kept tests lightweight (contract + health) so CI is green immediately. You can extend with repo/i18n/pagination tests later without changing the pipeline.
🔧 Modified files (delta only)
1) api/requirements.txt — add test & tooling deps
```diff
+# --- A+B1: test/tooling additions ---
+pytest==8.3.2
+pytest-asyncio==0.23.8
+httpx==0.27.2
+fastapi==0.114.0   # ensure TestClient compatibility (pin or keep your version)
+uvicorn==0.30.6    # e2e local run if needed
+pyyaml==6.0.2      # used by contract step helpers if needed
```
If you already pin FastAPI/Uvicorn elsewhere, keep your preferred versions.
🆕 New files (full content)
2) .github/workflows/ci.yml — Full CI (lint → type → unit → contract → SBOM/scan → build)
```yaml
name: ci

on:
  pull_request:
    branches: [ main, develop ]
  push:
    branches: [ main ]

permissions:
  contents: read
  packages: write
  id-token: write      # for cosign keyless (only on main push)
  attestations: write

env:
  PYTHON_VERSION: "3.11"
  IMAGE_NAME_API: ghcr.io/${{ github.repository }}/portal-api
  IMAGE_NAME_CRON: ghcr.io/${{ github.repository }}/cron-runner
  SBOM_FILE: sbom.spdx.json
  TRIVY_EXIT_CODE: "1"   # fail on high/critical
  # For contract diff: compare HEAD against main's openapi.yaml
  OPENAPI_PATH: api/app/openapi/openapi.yaml
  BASE_BRANCH: main

jobs:
  build-test:
    runs-on: ubuntu-latest
    # Optional: no services by default (DB not required for A+B1).
    # Add postgres here later if you extend unit tests to hit DB:
    # services:
    #   postgres:
    #     image: postgres:16
    #     env:
    #       POSTGRES_PASSWORD: postgres
    #     ports: [ "5432:5432" ]
    #     options: >-
    #       --health-cmd="pg_isready -U postgres"
    #       --health-interval=10s --health-timeout=5s --health-retries=5
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - name: Install deps
        run: |
          python -m pip install -U pip
          pip install -r api/requirements.txt
          # tooling
          pip install ruff mypy
          # test runner
          pip install pytest pytest-asyncio httpx

      - name: Lint (ruff)
        run: ruff check api

      - name: Type check (mypy)
        run: mypy api || true   # keep soft until full annotations are in

      - name: Unit tests
        working-directory: api
        run: pytest -q

      - name: Contract diff (OpenAPI) — breaking-change guard
        if: ${{ github.ref != 'refs/heads/main' }}   # run on PRs comparing to main
        run: |
          set -euo pipefail
          # extract base openapi.yaml from BASE_BRANCH (remote ref; local branch
          # may not exist on a PR checkout)
          git show "origin/${BASE_BRANCH}:${OPENAPI_PATH}" > /tmp/base-openapi.yaml || {
            echo "No base openapi on ${BASE_BRANCH}; skipping diff."; exit 0; }
          # use containerized oasdiff (Tufin); mount the workspace so the
          # container can read the head spec
          docker run --rm -v /tmp:/data -v "${GITHUB_WORKSPACE}:/work" ghcr.io/tufin/oasdiff:latest \
            breaking /data/base-openapi.yaml "/work/${OPENAPI_PATH}" \
            || { echo "::error title=OpenAPI breaking changes detected::Please review schema changes"; exit 1; }

      - name: Build portal-api image (no push on PR)
        if: ${{ github.event_name == 'pull_request' }}
        run: |
          docker build -t "$IMAGE_NAME_API:pr-${{ github.event.number }}" -f api/Dockerfile .
          echo "Built $IMAGE_NAME_API:pr-${{ github.event.number }}"

  sbom-scan-build-push:
    # Only on main pushes (after tests)
    needs: [ build-test ]
    if: ${{ github.ref == 'refs/heads/main' }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build images
        run: |
          docker build -t "$IMAGE_NAME_API:${{ github.sha }}" -f api/Dockerfile .
          docker build -t "$IMAGE_NAME_CRON:${{ github.sha }}" -f cron-runner/Dockerfile cron-runner

      - name: Generate SBOM (Syft)
        uses: anchore/sbom-action@v0
        with:
          image: ${{ env.IMAGE_NAME_API }}:${{ github.sha }}
          output-file: ${{ env.SBOM_FILE }}
          format: spdx-json

      - name: Security scan (Trivy)
        uses: aquasecurity/trivy-action@0.24.0
        with:
          image-ref: ${{ env.IMAGE_NAME_API }}:${{ github.sha }}
          severity: HIGH,CRITICAL
          exit-code: ${{ env.TRIVY_EXIT_CODE }}

      - name: Push images
        run: |
          docker push "$IMAGE_NAME_API:${{ github.sha }}"
          docker push "$IMAGE_NAME_CRON:${{ github.sha }}"

      # cosign is not preinstalled on GitHub-hosted runners
      - name: Install cosign
        uses: sigstore/cosign-installer@v3

      - name: Cosign sign (keyless OIDC)
        env:
          COSIGN_EXPERIMENTAL: "1"
        run: |
          cosign version
          cosign sign --yes "$IMAGE_NAME_API@$(docker inspect --format='{{index .RepoDigests 0}}' "$IMAGE_NAME_API:${{ github.sha }}" | cut -d'@' -f2)"
          cosign sign --yes "$IMAGE_NAME_CRON@$(docker inspect --format='{{index .RepoDigests 0}}' "$IMAGE_NAME_CRON:${{ github.sha }}" | cut -d'@' -f2)"

      - name: Upload SBOM artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ env.SBOM_FILE }}
          path: ${{ env.SBOM_FILE }}
```
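To sanity-check the contract gate locally without pulling the oasdiff container, a small Python sketch can approximate the most common class of breaking change: a removed path or operation. The function name is illustrative, and this is not a substitute for oasdiff's full rule set:

```python
# Flag operations present in the base spec but missing from the head spec.
# This covers only removals; the CI lane still relies on oasdiff for the rest.
from typing import Any

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "head", "options"}


def removed_operations(base: dict[str, Any], head: dict[str, Any]) -> list[str]:
    """Return 'METHOD /path' strings that exist in base but not in head."""
    removed = []
    head_paths = head.get("paths", {})
    for path, ops in base.get("paths", {}).items():
        for method in ops:
            if method in HTTP_METHODS and method not in head_paths.get(path, {}):
                removed.append(f"{method.upper()} {path}")
    return sorted(removed)


if __name__ == "__main__":
    base = {"paths": {"/items": {"get": {}, "post": {}}}}
    head = {"paths": {"/items": {"get": {}}}}
    print(removed_operations(base, head))  # ['POST /items']
```

Feed it the parsed base and head specs (e.g. via `yaml.safe_load`) and fail the check when the list is non-empty.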
3) .github/workflows/release.yml — SemVer tag → build, push, create Release with notes
```yaml
name: release

on:
  push:
    tags:
      - 'v*.*.*'

permissions:
  contents: write
  packages: write
  id-token: write

env:
  IMAGE_NAME_API: ghcr.io/${{ github.repository }}/portal-api
  IMAGE_NAME_CRON: ghcr.io/${{ github.repository }}/cron-runner
  SBOM_FILE: sbom.spdx.json

jobs:
  build-push-release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build
        run: |
          docker build -t "$IMAGE_NAME_API:${{ github.ref_name }}" -f api/Dockerfile .
          docker build -t "$IMAGE_NAME_CRON:${{ github.ref_name }}" -f cron-runner/Dockerfile cron-runner

      - name: Generate SBOM
        uses: anchore/sbom-action@v0
        with:
          image: ${{ env.IMAGE_NAME_API }}:${{ github.ref_name }}
          output-file: ${{ env.SBOM_FILE }}
          format: spdx-json

      - name: Push
        run: |
          docker push "$IMAGE_NAME_API:${{ github.ref_name }}"
          docker push "$IMAGE_NAME_CRON:${{ github.ref_name }}"

      # cosign is not preinstalled on GitHub-hosted runners
      - name: Install cosign
        uses: sigstore/cosign-installer@v3

      - name: Cosign sign (keyless)
        env:
          COSIGN_EXPERIMENTAL: "1"
        run: |
          cosign sign --yes "$IMAGE_NAME_API:${{ github.ref_name }}"
          cosign sign --yes "$IMAGE_NAME_CRON:${{ github.ref_name }}"

      - name: Create GitHub Release
        uses: softprops/action-gh-release@v2
        with:
          tag_name: ${{ github.ref_name }}
          name: Release ${{ github.ref_name }}
          generate_release_notes: true
          files: |
            ${{ env.SBOM_FILE }}
```
4) pyproject.toml — ruff/black/mypy config (centralized)
```toml
[tool.black]
line-length = 100
target-version = ["py311"]

[tool.ruff]
line-length = 100
exclude = ["api/app/openapi/*"]

[tool.ruff.lint]
select = ["E", "F", "I", "B"]
ignore = ["E203"]  # note: W503 is a flake8 rule; ruff does not implement it

[tool.mypy]
python_version = "3.11"
warn_unused_ignores = true
check_untyped_defs = true
ignore_missing_imports = true

[tool.pytest.ini_options]
addopts = "-q"
testpaths = [
  "api/app/tests",
]
```
5) pytest.ini — (if you prefer separate file)
```ini
[pytest]
addopts = -q
testpaths = api/app/tests
```
6) .github/PULL_REQUEST_TEMPLATE.md
```markdown
## Summary
- What changed and why?

## Checks
- [ ] Lint/Type/Unit pass locally
- [ ] OpenAPI changes reviewed (no breaking changes, or justified)
- [ ] Docs/Runbook updated if needed

## Testing
- Evidence (logs, screenshots, links)
```
7) CODEOWNERS
```
*                    @your-org/platform-leads
/api/**              @your-org/backend-owners
/k8s/**              @your-org/infrastructure-owners
/infra/terraform/**  @your-org/infrastructure-owners
```
8) Makefile — handy wrappers
```makefile
.PHONY: install lint type test build sbom scan

install:
	pip install -r api/requirements.txt
	pip install ruff mypy pytest pytest-asyncio httpx

lint:
	ruff check api

type:
	mypy api || true

test:
	pytest -q api

build:
	docker build -t portal-api:dev -f api/Dockerfile .

sbom:
	syft packages portal-api:dev -o spdx-json > sbom.spdx.json || true

scan:
	trivy image --severity HIGH,CRITICAL --exit-code 1 portal-api:dev || true
```
9) Minimal tests (A+B1)
9-a) api/app/tests/e2e/test_health_minimal.py — verifies health endpoints exist
```python
# purpose: minimal E2E smoke test against the FastAPI app object.
# It supports either /health/* (Phase J) or /healthz/* (Phase A skeleton).
from fastapi.testclient import TestClient

try:
    # your app should be exposed as `app` in main.py
    from app.main import app  # type: ignore
except Exception as e:
    import pytest

    pytest.skip(f"Cannot import app.main.app: {e}", allow_module_level=True)

client = TestClient(app)


def _get_first_ok(paths: list[str]) -> int:
    for p in paths:
        r = client.get(p)
        if r.status_code in (200, 204):
            return r.status_code
    return -1


def test_health_live_like():
    assert _get_first_ok(["/health/live", "/live", "/healthz/livez"]) in (200, 204)


def test_health_ready_like():
    assert _get_first_ok(["/health/ready", "/ready", "/healthz/healthz"]) in (200, 204)


def test_health_startup_like():
    assert _get_first_ok(["/health/startup", "/startup", "/healthz/startupz"]) in (200, 204)
```
9-b) api/app/tests/contract/test_openapi_diff.py — guard breaking changes
```python
# purpose: quick guard so CI fails if the OpenAPI file disappears or is empty.
# The heavy breaking-change check runs in the workflow via tufin/oasdiff.
from pathlib import Path

# this file lives at api/app/tests/contract/, so parents[4] is the repo root
OPENAPI_PATH = Path(__file__).resolve().parents[4] / "api" / "app" / "openapi" / "openapi.yaml"


def test_openapi_exists_and_nonempty():
    assert OPENAPI_PATH.exists(), f"Missing: {OPENAPI_PATH}"
    assert OPENAPI_PATH.stat().st_size > 1000, "OpenAPI seems too small; check generation/commit"
```
You can add the deeper repo upsert/pagination/i18n negative tests later; CI already has the lanes in place.
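As a cheap middle step before those, a slightly deeper structural guard is easy to add. A sketch under stated assumptions: the helper name is illustrative, and in the real test you would pass it `yaml.safe_load(OPENAPI_PATH.read_text())` (PyYAML is already pinned in `api/requirements.txt`):

```python
# Illustrative structural guard: fail loudly if the spec lacks the keys
# every valid OpenAPI 3.x document must have.
from typing import Any


def assert_minimal_openapi_shape(spec: dict[str, Any]) -> None:
    """Check version, title, and a non-empty paths section."""
    assert str(spec.get("openapi", "")).startswith("3."), "expected an OpenAPI 3.x version"
    assert spec.get("info", {}).get("title"), "missing info.title"
    assert spec.get("paths"), "paths section is missing or empty"


if __name__ == "__main__":
    sample = {
        "openapi": "3.0.3",
        "info": {"title": "portal-api"},
        "paths": {"/health/live": {"get": {}}},
    }
    assert_minimal_openapi_shape(sample)  # passes silently
    print("shape ok")
```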
10) cron-runner/ (placeholder so CI can build two images)
10-a) cron-runner/Dockerfile
# purpose: minimal image to satisfy A+B1 build/push/sign lanes
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
CMD ["python", "app.py"]
10-b) cron-runner/requirements.txt
```
requests==2.32.3
```
10-c) cron-runner/app.py
```python
# purpose: placeholder; Phase K will replace this with API-call orchestration
import os
import time

if __name__ == "__main__":
    print("cron-runner placeholder started")
    print("ENV preview:", {k: v for k, v in os.environ.items() if k.startswith("PORTAL_")})
    time.sleep(1)
    print("ok")
```
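When Phase K replaces the placeholder with real API-call orchestration, it will need retry behavior against the portal API. Here is a minimal sketch of an exponential-backoff wrapper it could use; the function and parameter names are assumptions, not part of the current codebase:

```python
# Hypothetical Phase K helper: retry a callable with exponential backoff.
# In real use, narrow the except clause to requests.RequestException.
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def call_with_backoff(fn: Callable[[], T], attempts: int = 3, base_delay: float = 0.5) -> T:
    """Run fn(); on exception, retry with delays of base_delay * 2**n."""
    if attempts < 1:
        raise ValueError("attempts must be >= 1")
    for n in range(attempts):
        try:
            return fn()
        except Exception:
            if n == attempts - 1:
                raise  # out of retries: propagate the last error
            time.sleep(base_delay * (2 ** n))
    raise AssertionError("unreachable")
```

Wrapping each outbound call this way keeps transient 5xx/network blips from failing a whole cron run.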
11) Terraform — “thin skeleton”
11-a) infra/terraform/bootstrap/main.tf — create remote state backend (run once)
```hcl
###############################################################
# purpose: one-time bootstrap for S3 remote state + Dynamo lock
###############################################################
terraform {
  required_version = ">= 1.6.0"
  required_providers {
    aws = { source = "hashicorp/aws", version = "~> 5.60" }
  }
}

provider "aws" {
  region = var.region
}

resource "aws_s3_bucket" "state" {
  bucket        = "${var.project}-tfstate-${var.env}"
  force_destroy = false
}

resource "aws_s3_bucket_versioning" "state" {
  bucket = aws_s3_bucket.state.id
  versioning_configuration { status = "Enabled" }
}

resource "aws_s3_bucket_server_side_encryption_configuration" "state" {
  bucket = aws_s3_bucket.state.id
  rule {
    apply_server_side_encryption_by_default { sse_algorithm = "AES256" }
  }
}

resource "aws_dynamodb_table" "lock" {
  name         = "${var.project}-tf-lock-${var.env}"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }
}

output "state_bucket" { value = aws_s3_bucket.state.id }
output "lock_table"   { value = aws_dynamodb_table.lock.name }
```
11-b) infra/terraform/bootstrap/variables.tf
```hcl
variable "project" { type = string }
variable "env"     { type = string }

variable "region" {
  type    = string
  default = "ap-northeast-1"
}
```
11-c) infra/terraform/core/backend.tf — use remote state created above
```hcl
terraform {
  backend "s3" {
    bucket         = "<replace-with-bootstrap-output>" # e.g., portal-tfstate-dev
    key            = "core/terraform.tfstate"
    region         = "ap-northeast-1"
    dynamodb_table = "<replace-with-bootstrap-output>" # e.g., portal-tf-lock-dev
    encrypt        = true
  }
}
```
11-d) infra/terraform/core/providers.tf
```hcl
terraform {
  required_version = ">= 1.6.0"
  required_providers {
    aws = { source = "hashicorp/aws", version = "~> 5.60" }
  }
}

provider "aws" {
  region = var.region

  default_tags {
    tags = {
      project       = var.project
      env           = var.env
      customer      = var.customer
      owner         = var.owner
      "cost-center" = var.cost_center # hyphenated keys must be quoted
    }
  }
}
```
11-e) infra/terraform/core/variables.tf
```hcl
variable "project" { type = string }
variable "env"     { type = string }

variable "region" {
  type    = string
  default = "ap-northeast-1"
}

variable "customer" {
  type    = string
  default = "rakka"
}

variable "owner" {
  type    = string
  default = "platform"
}

variable "cost_center" {
  type    = string
  default = "portal"
}
```
11-f) infra/terraform/core/ecr.tf
```hcl
resource "aws_ecr_repository" "portal_api" {
  name                 = "portal-api"
  image_tag_mutability = "MUTABLE"
  image_scanning_configuration { scan_on_push = true }
}

resource "aws_ecr_repository" "cron_runner" {
  name                 = "cron-runner"
  image_tag_mutability = "MUTABLE"
  image_scanning_configuration { scan_on_push = true }
}
```
11-g) infra/terraform/core/oidc_github.tf
```hcl
# purpose: allow GitHub Actions to assume roles in this account (no long-lived keys)
data "aws_caller_identity" "current" {}

data "aws_iam_openid_connect_provider" "github" {
  arn = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:oidc-provider/token.actions.githubusercontent.com"
}

resource "aws_iam_role" "github_actions" {
  name = "${var.project}-${var.env}-gha-oidc"
  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [{
      Effect    = "Allow",
      Principal = { Federated = data.aws_iam_openid_connect_provider.github.arn },
      Action    = "sts:AssumeRoleWithWebIdentity",
      Condition = {
        StringEquals = {
          "token.actions.githubusercontent.com:aud" = "sts.amazonaws.com"
        }
        # tighten later with a StringLike condition on the `sub` claim so
        # only your repos/branches can assume this role
      }
    }]
  })
}

resource "aws_iam_role_policy" "github_actions_inline" {
  name = "${var.project}-${var.env}-gha-oidc-inline"
  role = aws_iam_role.github_actions.id
  policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      # extend later as needed (ECR push, etc.)
      { Effect = "Allow", Action = ["ecr:*"], Resource = "*" }
    ]
  })
}
```
11-h) infra/terraform/core/route53_acm.tf (skeleton)
```hcl
variable "root_domain" {
  type    = string
  default = "example.com"
}

resource "aws_route53_zone" "root" {
  name = var.root_domain
}

resource "aws_acm_certificate" "wildcard_tokyo" {
  domain_name       = "*.${var.root_domain}"
  validation_method = "DNS"
  provider          = aws
}

# DNS validation records (simplified; expand as needed)
resource "aws_route53_record" "cert_validation" {
  for_each = {
    for dvo in aws_acm_certificate.wildcard_tokyo.domain_validation_options : dvo.domain_name => {
      name  = dvo.resource_record_name
      type  = dvo.resource_record_type
      value = dvo.resource_record_value
    }
  }

  zone_id = aws_route53_zone.root.zone_id
  name    = each.value.name
  type    = each.value.type
  ttl     = 60
  records = [each.value.value]
}
```
11-i) infra/terraform/core/vpc_minimal.tf
```hcl
resource "aws_vpc" "this" {
  cidr_block           = "10.80.0.0/16"
  enable_dns_hostnames = true
  enable_dns_support   = true
}

resource "aws_subnet" "public_a" {
  vpc_id                  = aws_vpc.this.id
  cidr_block              = "10.80.0.0/20"
  availability_zone       = "ap-northeast-1a"
  map_public_ip_on_launch = true
}

resource "aws_subnet" "public_c" {
  vpc_id                  = aws_vpc.this.id
  cidr_block              = "10.80.16.0/20"
  availability_zone       = "ap-northeast-1c"
  map_public_ip_on_launch = true
}

# IGW + route table (minimal; extend with NAT later)
resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.this.id
}

resource "aws_route_table" "public" {
  vpc_id = aws_vpc.this.id

  route {
    cidr_block = "0.0.0.0/0"
    gateway_id = aws_internet_gateway.igw.id
  }
}

resource "aws_route_table_association" "a" {
  subnet_id      = aws_subnet.public_a.id
  route_table_id = aws_route_table.public.id
}

resource "aws_route_table_association" "c" {
  subnet_id      = aws_subnet.public_c.id
  route_table_id = aws_route_table.public.id
}
```
12) docs/RUNBOOK.md — quick ops notes
# RUNBOOK — Release / Rollback / Security Artifacts

## Release (A+B1)
1. Merge to `main` → CI builds & pushes images to GHCR, signs with cosign.
2. Tag `vX.Y.Z` → `release.yml` builds, signs, creates a GitHub Release, and attaches the SBOM.

## Verify images
```bash
# cosign 2.x requires identity flags for keyless verification
cosign verify \
  --certificate-identity-regexp 'https://github.com/<org>/<repo>/' \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  ghcr.io/<org>/<repo>/portal-api:<tag>
```

## SBOM & vulnerability
- SBOM: `sbom.spdx.json` attached to the Release.
- Trivy: CI fails on HIGH/CRITICAL by default.

## Rollback
- Deploy a previous digest (pin by digest in manifests).
- No code rebuild required.

## OpenAPI contract guard
- PRs run `tufin/oasdiff breaking` against `main`.
- Breaking changes require explicit approval and a major version bump.
---
## ✅ What this gives you on Day 1
- CI that **catches breaking OpenAPI**, runs **unit/E2E smoke**, and produces **signed images** + **SBOM**.
- Terraform **remote state + OIDC + ECR + Route53/ACM + minimal VPC** ready to extend later.
- Minimal tests that won’t flake, with clear room to add repo/pagination/i18n cases next.
If you want me to also drop **sample kustomize overlays** that pin images by **digest**, I can add those next.