The BMO Lab presents two events with internationally renowned artist Kyle McDonald

When and Where

Saturday, February 29, 2020, 1:00 pm to Monday, March 2, 2020, 6:00 pm
BMO Lab
Koffler Student Services Centre
214 College Street, Toronto, ON M5T 2Z9

Speakers

Kyle McDonald

Description

The BMO Lab for Creative Research in the Arts, Performance, Emerging Technologies and Artificial Intelligence is very pleased to present two events with internationally renowned artist Kyle McDonald. 
 
Workshop: Saturday February 29th, 1-4 pm
Artist’s Talk / Lecture: Monday March 2, 4:30-6 pm
 
Kyle McDonald is an artist working with code. He crafts interactive installations, performances, sneaky interventions, playful websites, workshops, and toolkits for other artists working with code. He explores the possibilities of new technologies: to understand how they affect society, to misuse them, and to build alternative futures. He works with machine learning, computer vision, and social and surveillance tech. He has been an adjunct professor at NYU's ITP, a member of F.A.T. Lab (Free Art and Technology), community manager for openFrameworks, and artist in residence at the STUDIO for Creative Inquiry at CMU and at YCAM in Japan. His work has been commissioned and shown around the world, including at the V&A (London), NTT ICC (Tokyo), Ars Electronica (Linz, Austria), Sónar (Barcelona), Today's Art (The Hague), and Eyebeam (NYC).

On Monday, March 2, at 4:30 pm, he will present an artist's talk focusing on his work with AI and machine learning in the context of performance.

He will lead a workshop on Saturday, February 29, from 1 to 4 pm, introducing participants to the open-source, web-based programming environment p5.js for creating interactive systems using computer vision and machine learning.

Both events are happening at the BMO Lab in the Koffler building at 214 College Street. (Access is via the doors on St. George, just north of College Street. Come to the 3rd floor, go through the door on the left, and follow the hallway to the BMO Lab.)

Workshop description:
This hands-on workshop will begin by revisiting the basics of coding with p5.js, including drawing, animation, and interactivity. We will then cover computer vision techniques based on simple pixel processing and machine learning, with a focus on tracking bodies, faces, and hands.

p5.js is a JavaScript library designed to make coding accessible for artists, designers, and educators. “Computer vision” refers to a broad collection of techniques that allow computers to make intelligent assertions about what's going on in digital images and video. “Machine learning” refers to explaining tasks to computers via examples (training data) instead of instructions (code). Using p5.js, we can quickly leverage the power of new computer vision algorithms built on machine learning to create camera-driven interactive artwork.

We will discuss the ml5.js toolkit and how it fits into the broader ecosystem of modern machine learning tools. We will use ml5.js to detect common objects in front of the webcam, and to train a custom classifier that can distinguish between personal objects in front of the webcam. The class will adapt to the familiarity of the students: if the fundamentals of creative coding are already well understood, by the end of the workshop we will discuss higher-level machine learning concepts such as generative adversarial networks for image generation and recurrent neural networks for text and music generation, without focusing on these topics in depth.
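To give a sense of how compact this kind of camera-driven sketch can be, here is a minimal example in the spirit of the workshop exercises (not an official workshop file). It uses p5.js for the webcam and drawing, and the ml5.js image classifier with the pre-trained MobileNet model to label whatever the camera sees. It assumes p5.js and ml5.js are loaded via script tags, and follows the error-first callback style of the ml5.js 0.x API that was current in early 2020.

// Assumes p5.js and ml5.js (0.x) are loaded in index.html via <script> tags.
let video;       // webcam capture
let classifier;  // ml5 image classifier (MobileNet)
let label = 'loading model...';

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);   // open the webcam
  video.size(640, 480);
  video.hide();                   // we draw the frames ourselves in draw()
  // Load MobileNet and bind it to the video element
  classifier = ml5.imageClassifier('MobileNet', video, modelReady);
}

function modelReady() {
  classifier.classify(gotResult); // classify the current frame
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  // results is a list of predictions, sorted by confidence;
  // property names (label/confidence) vary slightly across ml5 versions
  label = results[0].label;
  classifier.classify(gotResult); // loop: classify the next frame
}

function draw() {
  image(video, 0, 0, width, height); // show the live camera image
  fill(255);
  textSize(24);
  text(label, 10, height - 20);      // overlay the top prediction
}

The same structure extends naturally to the workshop's other examples: swapping the classifier for an ml5 object detector or a custom feature-extractor classifier changes only the model loading and the result-handling callback, while the p5.js drawing loop stays the same.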
 
You must pre-register (by e-mail to david.rokeby@utoronto.ca) if you wish to participate in the workshop. Capacity is limited.