Governance Built for Trust

Our governance is built on transparency, accountability, and rapid decision-making. As an independent, non-governmental organization modeled after ISO, TIA, and IEC, we maintain a governance structure that gives all stakeholders a voice while preserving the agility needed for effective action.

Note: This is an exemplar draft to demonstrate the concept. The governance structure and metrics framework will be officially formulated through community input.

How We’re Organized

Internal Structure

Diagram: OSC Organizational Structure

External Stakeholders

Diagram: OSC Stakeholder Model

Funding Allocation

Funding flow: research grants contribute 0.1% of their value to the OSC Fund, which is distributed to software projects according to their impact metrics.

Funding decisions are based on transparent metrics: usage statistics, academic citations, community health, and innovation potential.
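To make the flow concrete, here is a minimal sketch in Python of how the 0.1% contribution could be pooled and then split across projects in proportion to their impact scores. The grant amounts, project names, and scores are purely illustrative, and the proportional split is an assumption rather than the adopted allocation rule.

```python
# Minimal sketch of the funding flow described above.
# All figures and project names are hypothetical examples.

GRANT_CONTRIBUTION_RATE = 0.001  # 0.1% of each research grant flows into the OSC Fund

def osc_fund_total(grant_amounts):
    """Pool 0.1% of every participating research grant into the OSC Fund."""
    return sum(amount * GRANT_CONTRIBUTION_RATE for amount in grant_amounts)

def distribute_fund(fund_total, project_scores):
    """Split the fund across software projects in proportion to their impact scores."""
    total_score = sum(project_scores.values())
    return {
        project: fund_total * score / total_score
        for project, score in project_scores.items()
    }

# Example: three grants and two projects (illustrative values only)
grants = [2_000_000, 750_000, 1_250_000]                   # grant sizes in dollars
scores = {"array-library": 0.82, "niche-solver": 0.41}     # scores from the metrics framework

fund = osc_fund_total(grants)   # 0.1% of 4,000,000 = 4,000
print(distribute_fund(fund, scores))
```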

Key Principles

Member Rights

  • Voting rights in governance decisions
  • Representation on technical committees
  • Access to shared resources and services

Member Responsibilities

  • Financial contributions based on research volume
  • Participation in standards development
  • Community engagement and support

Evidence-Based Funding

Every funding decision is backed by transparent, verifiable data. Our multi-dimensional metrics framework ensures resources reach the tools researchers actually depend on.

How We Measure Impact

Metrics Framework: Usage (40%), Academic Impact (30%), Community Health (20%), Innovation (10%)

Our weighted scoring system balances breadth of adoption with depth of impact, ensuring that both widely used tools and specialized critical infrastructure receive appropriate support.
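As a sketch of how those weights could combine into a single score, the following Python snippet assumes each dimension has already been normalized to a 0–1 sub-score; the sub-score names and example values are hypothetical, while the weights are those stated above.

```python
# Sketch of the weighted scoring described above.
# Sub-scores are assumed to be pre-normalized to the 0-1 range;
# the weights are the ones stated in the framework.

WEIGHTS = {
    "usage": 0.40,             # breadth of adoption (downloads, installs, dependents)
    "academic_impact": 0.30,   # citations and mentions in publications
    "community_health": 0.20,  # contributor activity, issue responsiveness
    "innovation": 0.10,        # potential to enable new research
}

def impact_score(subscores: dict[str, float]) -> float:
    """Combine normalized sub-scores (0-1) into a single weighted impact score."""
    return sum(WEIGHTS[name] * subscores.get(name, 0.0) for name in WEIGHTS)

# Example: a widely used tool vs. specialized critical infrastructure (illustrative values)
print(impact_score({"usage": 0.95, "academic_impact": 0.60,
                    "community_health": 0.70, "innovation": 0.30}))
print(impact_score({"usage": 0.20, "academic_impact": 0.85,
                    "community_health": 0.55, "innovation": 0.90}))
```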

Data Collection Principles

Privacy-First

Aggregated and anonymized metrics. User consent for any individual data. Clear opt-out mechanisms.

Automated

Package manager APIs, academic database crawling, repository analytics. Minimal burden on maintainers. See the collection sketch after these principles.

Transparent

Public dashboard showing real-time funding allocation, impact metrics, and complete audit trail.

Verifiable

All metrics independently auditable. Methodology published. Regular third-party reviews.
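As one example of what automated, privacy-first collection can look like, the sketch below pulls already-aggregated monthly download counts for a package, so no individual user data is handled. The pypistats.org endpoint is used only as a readily available public example; a production pipeline would span multiple registries and ecosystems.

```python
# Sketch of automated, privacy-first usage collection.
# Download counts arrive pre-aggregated from the registry,
# so no individual user data is collected or stored.
import requests

def monthly_downloads(package: str) -> int:
    """Return aggregated downloads over the last month for a PyPI package."""
    url = f"https://pypistats.org/api/packages/{package}/recent"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json()["data"]["last_month"]

# Example packages (illustrative only)
for pkg in ("numpy", "astropy"):
    print(pkg, monthly_downloads(pkg))
```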

Success Indicators

For Software

  • Reduced bugs and security vulnerabilities
  • Improved stability and performance
  • Faster issue resolution

For Research

  • Increased reproducibility
  • Less time on technical troubleshooting
  • Enhanced cross-institution collaboration

For Maintainers

  • Stable employment and career paths
  • Professional recognition
  • Reduced burnout

Our Commitment to You

Transparency

All decisions, metrics, and financial information are publicly available.

Accountability

Democratic processes and independent oversight ensure community trust.

Efficiency

Streamlined processes enable quick decisions without sacrificing quality.

Impact

Every decision is made to maximize its positive impact on research software sustainability.