BIAS 2023
🚀Get ready for an action-packed week because we’re about to witness some truly MAMMOth contributions in the world of AI and fairness. https://shorturl.at/uHPX9
Dagmar Heeg & Dali Fekete – Centre for Learning and Teaching in collaboration with MAMMOth. Workshop: Critical use of ChatGPT. Join our dynamic workshop on ChatGPT! We will start a dialogue about prejudice, from a Western… Read More »Workshop: Critical use of ChatGPT
Dagmar Heeg & Dali Fekete – Centre for Learning and Teaching in collaboration with MAMMOth. Workshop: AI & Ethics: What do we want AI (not) to do? Don’t miss our in-depth workshop on artificial intelligence (AI) and… Read More »AI & Ethics: What do we want AI (not) to do?
An overview of the MAMMOth project and a summary of some WP3 tasks. This poster was presented at an open-day event organised by UniBw.
The official newsletter of the MAMMOth Project [May 2023]. The Call for Papers is out now! Fairness in Machine Learning continues to be a growing area of research and is perhaps now more relevant than ever, as new… Read More »May Newsletter
3rd Workshop on Bias and Fairness in AI, a workshop at ECML PKDD 2023, 22nd of September, Torino (Italy). The Call for Papers is out now! Fairness in Machine Learning continues to be a growing area of research and is… Read More »3rd Workshop on Bias and Fairness in AI
On 8 April 2019, the High-Level Expert Group on AI presented Ethics Guidelines for Trustworthy Artificial Intelligence. This followed the publication of the guidelines’ first draft in December 2018 on which more than 500 comments… Read More »Ethics guidelines for trustworthy AI
With our very first newsletter, we are introducing our new MAMMOth website! We are Live!!! This has been in the works for weeks and the launch date is finally here! We welcome visitors with featured… Read More »February Newsletter
Whether in assessing medical emergencies or ranking job applicants, AI systems perpetuate the racism and sexism of our society. An automatic soap dispenser that discriminates against Black people because its near-infrared-based sensors… Read More »Algorithmen lernen zu diskriminieren