
Eligibility & How to Apply: CILAR/CRRF National Anti-Racism Fund Algorithm for Change Program 2025

Are you ready to showcase your skills and make a change in your career? If so, apply for the CILAR/CRRF National Anti-Racism Fund Algorithm for Change Program 2025. The best solution presented during the hackathon series will receive recognition.

CILAR, in partnership with the CRRF National Anti-Racism Fund, is collaborating with the Toronto Public Library to host a series of hackathons to empower underrepresented Canadians to address hate speech using AI-driven solutions. Participants will learn how to detect and counter hate speech while driving influential change within corporate algorithms to promote racial equity. By joining, you’ll be part of a community that creates real change, helping to combat algorithmic bias and fostering diversity in the tech industry.

Application Deadline: 13 December 2024

What Participants Can Expect:

Participant Sign-Ups

  • Join as an individual or create a team with friends, peers, or colleagues.
  • Meet other like-minded individuals who are curious about AI.
  • A great team-building initiative, particularly for businesses and tech organizations.

Curriculum Focus

  • Develop tangible skills that can advance participants’ careers or businesses.
  • Focus on real-world AI applications in combating bias and developing inclusive technologies.
  • Apply these learnings in your business or workplace.

Learning Format

  • Collaborative sessions (both in-person and virtual).
  • Expert guest speakers from tech and social justice sectors.
  • Hands-on experience with AI tools used to address algorithmic bias.
  • Networking with leaders and professionals in AI, technology, and social justice.

Benefits:

  • Make an Impact: Be part of a solution-driven community working to combat hate speech and address algorithmic bias through AI.
  • Network Opportunities: Collaborate with tech leaders and social justice advocates, gaining valuable connections.
  • Skill Development: Learn hands-on AI tools and methods to develop real-world solutions that promote inclusivity.
  • Recognition: Top solutions will be showcased to corporate partners and investors, with potential for further development and certification.
  • Workshops & Mentorship: Participants will attend sessions led by experts on AI ethics and bias detection, receive hands-on experience with AI tools, and collaborate on real-world AI solutions.
  • Team-Based Projects: Teams will create solutions to counter hate speech and algorithmic bias, with guidance from mentors.

Eligibility:

  • Target Audience: Underrepresented Canadians, particularly individuals aged 18-40 with an interest in technology, AI, innovation, and social justice.
  • Demographic Focus: Socioeconomically marginalized groups seeking to make an impact through technology.

How to Apply for the CILAR/CRRF National Anti-Racism Fund Algorithm for Change Program 2025

Interested candidates should click here to apply.
