
[Chennai] Bhuvana Hareendran — Vibe Coding Submission #1215

Open
bhuvanahareem wants to merge 4 commits into nasscomAI:main from bhuvanahareem:participant/Bhuvana_Hareendran-Chennai

Conversation

@bhuvanahareem

Vibe Coding Workshop — Submission PR

Name: Bhuvana Hareendran
City / Group: Chennai
Date: 22 April 2026
AI tool(s) used: Antigravity (Gemini 3 Flash)


UC-0A — Complaint Classifier

Which failure mode did you encounter first?
Severity blindness.

What enforcement rule fixed it? Quote the rule exactly as it appears in your agents.md:

"Priority must be set to 'Urgent' if the description contains any of these keywords: injury, child, school, hospital, ambulance, fire, hazard, fell, collapse."

Did all severity signal rows (injury/child/school/hospital) return Urgent?
Yes.
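The quoted enforcement rule from agents.md is a keyword trigger, which can be sketched directly. The keyword list below is the one quoted above; the function name and the "Normal" default priority are illustrative assumptions, not part of the submission.

```python
# Sketch of the UC-0A severity rule. The keyword list is quoted from the
# submission's agents.md; classify_priority and the "Normal" fallback are
# hypothetical names for illustration.
URGENT_KEYWORDS = {
    "injury", "child", "school", "hospital",
    "ambulance", "fire", "hazard", "fell", "collapse",
}

def classify_priority(description: str) -> str:
    """Return 'Urgent' if any safety keyword appears in the description."""
    text = description.lower()
    if any(keyword in text for keyword in URGENT_KEYWORDS):
        return "Urgent"
    return "Normal"
```

A simple substring check like this is deliberately blunt: it guarantees that every severity-signal row is escalated, at the cost of occasional over-triggering.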


UC-0B — Summary That Changes Meaning

Which failure mode did you encounter?
Clause omission and obligation softening.

After your fix — are all 10 critical clauses present in summary_hr_leave.txt?
Yes.

Did the naive prompt add any information not in the source document (scope bleed)?
Yes, it often adds phrases like "as is standard practice," which are not in the municipal text.
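A clause-presence check like the one implied by the UC-0B fix can be sketched as a substring audit. The two anchor phrases below are placeholders, not the actual 10 critical clauses from summary_hr_leave.txt.

```python
# Illustrative clause-presence audit for UC-0B. CRITICAL_CLAUSES holds
# placeholder anchor phrases; the real list would carry the 10 clauses
# from the source HR leave document.
CRITICAL_CLAUSES = [
    "must notify the manager in writing",
    "within 3 working days",
]

def missing_clauses(summary: str) -> list[str]:
    """Return the critical clause anchors absent from the summary."""
    text = summary.lower()
    return [clause for clause in CRITICAL_CLAUSES if clause.lower() not in text]
```

Running this after each summarization pass makes clause omission a hard failure rather than something a reviewer has to spot by eye.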


UC-0C — Number That Looks Right

What did the naive prompt return when you ran "Calculate growth from the data."?
It returned a single aggregated growth percentage for all wards combined.

After your fix — does your system refuse all-ward aggregation?
Yes.

Does your growth_output.csv flag the 5 null rows rather than skipping them?
Yes, all 5 null rows (Kasba, Shivajinagar, Warje, Kothrud, Hadapsar) are flagged with reasons from the notes.
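The flag-don't-skip behavior described here can be sketched with the standard csv module. The column names ("ward", "population_2021", "notes") are assumptions; the five wards are the ones named in the submission.

```python
import csv
import io

# Hedged sketch of UC-0C null-row flagging: rows with a missing value are
# annotated with the reason from the notes column instead of being dropped.
# Column names are assumptions, not taken from the actual growth_output.csv.
def flag_nulls(raw_csv: str) -> list[dict]:
    """Return all rows, flagging nulls with their recorded reason."""
    out = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if row["population_2021"].strip() == "":
            row["flag"] = f"NULL: {row['notes'] or 'no reason recorded'}"
        else:
            row["flag"] = "ok"
        out.append(row)
    return out
```

Because every input row reaches the output, a reviewer can see at a glance that Kasba, Shivajinagar, Warje, Kothrud, and Hadapsar were excluded from the growth math for a stated reason, not silently lost.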


UC-X — Ask My Documents

What did the naive prompt return for the cross-document test question?
It blended IT and HR policies to incorrectly suggest that personal phones could be used for work files.

After your fix — what does your system return for this question?
It strictly cites IT Section 3.1 (email/portal access only) or provides the exact refusal template.

Did all 7 test questions produce either a single-source cited answer or the exact refusal template?
Yes.
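The single-source-or-refuse behavior can be sketched as a guard over retrieved chunks. The refusal wording and the chunk structure (dicts with "doc" and "text" keys) are placeholders, not the exact template or retrieval format from the submission.

```python
# Illustrative UC-X guard: answer only when every retrieved chunk comes
# from exactly one document; otherwise emit the refusal template.
# REFUSAL is a stand-in for the submission's exact refusal wording.
REFUSAL = "No single policy document answers this question; refusing."

def answer_from_chunks(chunks: list[dict]) -> str:
    """Return a single-source cited answer, or the refusal template."""
    docs = {chunk["doc"] for chunk in chunks}
    if len(docs) != 1:
        return REFUSAL  # cross-document blend -> refuse, never merge policies
    doc = docs.pop()
    body = " ".join(chunk["text"] for chunk in chunks)
    return f"{body} [source: {doc}]"
```

This is the structural fix for the IT/HR blending failure above: merging policies is made impossible by construction, because the answer path only exists when the source set has size one.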


CRAFT Loop Reflection

Which CRAFT step was hardest across all UCs, and why?
Refining the Enforcement rules. It required multiple iterations to ensure the agent didn't "helpfully" hallucinate information when the source was silent.

What is the single most important thing you added manually to an agents.md that the AI did not generate on its own?
The specific list of 9 safety keywords in UC-0A: "injury, child, school, hospital, ambulance, fire, hazard, fell, collapse."

Name one real task in your work where you will apply RICE + CRAFT within the next two weeks:
Automating the analysis of internal compliance documents for our team.

@github-actions

👋 Hi there, participant! Thanks for joining our Vibe Coding Session!

We're reviewing your PR for the 4 Use Cases. Once your submission is validated and merged, you'll be awarded your completion badge! 🏆

Next Steps:

  • Make sure all 4 UCs are finished.
  • Ensure your commit messages match the required format.
  • Good luck!
