Using GitHub Copilot Business Responsibly


What Happens to Your Code? 

  • Prompts are discarded once a suggestion is returned; GitHub does not retain them.
  • Your code is not used to train GitHub's AI models (Business tier guarantee).
  • Usage metadata (interactions, latency, feature engagement) is retained for 24 months and shared with Microsoft. 

What Can Go Wrong? 

  • Secrets leakage: Copilot may suggest code containing credentials or keys. Always review suggestions before accepting — never treat them as automatically safe.
  • Licensing: The underlying model is trained on public code, so some suggestions may resemble code under restrictive open-source licenses. Verify the provenance of unfamiliar snippets before shipping them.
  • Hallucinated packages: Copilot may suggest libraries or dependencies that don't exist or are malicious. Always validate imports against known package registries.
  • Overconfident output: Suggestions can look correct but contain subtle security vulnerabilities. Code review practices remain essential. Copilot is a collaborator, not a guarantor. 

Appropriate Use 

Appropriate:

  • Internal tooling and scripts (non-sensitive)
  • Code documentation
  • Refactoring and debugging
  • Open source/public-facing projects

Inappropriate:

  • Credentials and passwords
  • Level 4+ institutional data
  • Regulated research data
  • API keys, tokens, secrets

Recommended Reading 

GitHub Copilot Trust Center 

GitHub Copilot: Key Risks & Best Practices