Category: Letters

  • Letter to Cineplex on Facial Detection

    In response to Cineplex Digital Media’s digital billboards performing facial detection, we submitted the following letter.

    See PDF version here.


    Thursday, December 4, 2025

    Dear Chief Privacy Officer at Cineplex Digital Media:

    We are writing to you with regard to privacy concerns surrounding your use of Anonymous Video Analytics (AVA) in digital signage near Union Station Bus Terminal (USBT) and elsewhere in Canada. We represent both technologists and regular Canadians who see the value in innovation and technological development for the public good.

    In 2020, the Office of the Privacy Commissioner of Canada (OPC), as well as the privacy commissioners for Alberta and British Columbia, released a report on a case involving use of AVA technology by Cadillac Fairview (CF) in 2018.1 The report found that CF’s AVA deployment constituted a violation of privacy, and that “express opt-in consent would be required, as they determined that some of the information involved was sensitive and its surreptitious collection in this context would be outside the reasonable expectations of consumers.”2

    We have a number of concerns relating to the aforementioned report and would like some clarifications.

    Concern 1: How has the joint investigation conducted in 2020 by the Offices of the Privacy Commissioner of Canada, Alberta and British Columbia affected Cineplex Digital Media’s (CDM’s) deployment of AVA technology?

    Concern 2: A Toronto Star article by Kevin Jiang also reports that CDM consulted with the OPC on this project.3 How has this consultation mitigated some of the privacy issues that have arisen in the use of AVA?

    We also have concerns with certain statements in the privacy notice affixed by CDM to AVA-enabled digital signage.4

    Statement 1: This media unit runs anonymous software, used to generate statistics about audience counts, gender and approximate age only.

    Concern 3: What is the intended meaning of the phrase “anonymous software” per Statement 1 (also see Figure 1 in the Appendix)?

    Statement 2: Images are processed in a few milliseconds before being immediately and permanently deleted.

    Concern 4: If images are processed in a matter of milliseconds before being permanently deleted, this implies that images are processed only on-device, rather than being sent to a remote server or cloud service. Are images processed on-device?
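
    For clarity on what we mean by on-device processing, the following is a purely illustrative sketch of our own (not a description of CDM’s actual system; the function and field names are hypothetical) in which the captured frame never leaves local scope and only aggregate statistics are reported:

      # Purely illustrative sketch (not CDM's code): an on-device AVA loop in which
      # the raw frame never leaves local scope and only aggregate statistics are kept.
      from dataclasses import dataclass, field

      @dataclass
      class FrameStats:
          faces_detected: int = 0
          age_buckets: dict = field(default_factory=dict)    # e.g. {"25-34": 2}
          gender_counts: dict = field(default_factory=dict)  # e.g. {"female": 1}

      def analyze_frame(frame) -> FrameStats:
          # Stand-in for a local face-detection/classification model running on-device.
          # A real system would run inference here; no pixels are returned or stored.
          return FrameStats(faces_detected=0)

      def run_capture_loop(frames, report):
          for frame in frames:              # the raw image exists only inside this loop
              stats = analyze_frame(frame)
              report(stats)                 # only derived statistics leave the unit
              del frame                     # the frame buffer is discarded immediately

      # Example: frames would come from the unit's camera; here we pass an empty list.
      run_capture_loop([], report=print)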

    Concern 5: Is any personally identifying information or biometric data collected, inferred, extracted or stored beyond statistics on audience counts, gender and approximate age?

    Concern 6: Does the data collected by AVA reside in Canada?

    Concern 7: What measures are in place to protect data privacy during transfer and storage of data in remote servers or cloud facilities?

    We also have a number of concerns with certain statements in your notice of disclosure:5

    Statement 3: We ensure that the public is well informed as [sic] the presence of anonymous video analytics systems by placing signage and stickers on kiosks, at property entrance, exit ways and other places along the path to the property.

    Concern 8: When members of Technologists for Democracy inspected Union Station Bus Terminal in early November 2025, they found no notice of AVA systems other than the notices attached to the display signage itself. This runs contrary to the above statement.

    Statement 4: Camera sensors are installed in plain sight and are never hidden. We want the public to understand exactly where they are placed so, if they chose, they can avoid it.

    Concern 9: Cameras attached to Cineplex’s digital signage are visible but quite small and difficult to identify. Over 90% of the general public interviewed by members of Technologists for Democracy were not previously aware of the cameras attached to the signage. The areas where the digital ads are currently placed adjacent to USBT are unavoidable for individuals passing through the hallways coming from Union Station (two digital ads are placed very close to a train timetable, while a third faces an entrance to USBT itself). This and Concern 8 run contrary to the statements that cameras are “in plain sight”, “never hidden” and that “if they [the public] chose, they can avoid it.”

    We also have a number of concerns with certain statements in your privacy policy:6

    Statement 5: Other Uses: We may also use Personal Information, where necessary, for:
    establishing, maintaining and/or fulfilling relationships with our business partners, third-party vendors of products and/or services, as well as our corporate and business customers;

    Concern 10: Are images of individuals sold to any third parties?

    Concern 11: Are statistics on audience counts, gender and approximate age sold to any third parties?

    Concern 12: Are any other data collected by AVA sold to third parties?

    Statement 6: Cineplex may also share Personal Information necessary to meet legal, audit, regulatory, insurance, security or other similar requirements. For instance, Cineplex may be compelled to disclose Personal Information in response to a law, regulation, court order, subpoena, valid demand, search warrant, government investigation or other legally valid request or enquiry. We may also share information with our accountants, auditors, agents and lawyers in connection with the enforcement or protection of our legal rights.

    Concern 13: If data collected by AVA systems or digital signage is requested and sent to third parties for legally valid requests or enquiries, is there a mechanism for informing the public?

    Concern 14: If data collected by AVA systems or digital signage is requested and sent to third parties for legally valid requests or enquiries, is there a mechanism for third-party audits of such data requests or enquiries?

    Concern 15: It has been reported that CDM is being sold to US-based Creative Realities,7 which will take over ad billboards spanning malls and office buildings. Per this deal, will Creative Realities also take ownership of AVA systems?

    We request that Cineplex Digital Media provide clarity on:

    1. How you are ensuring that individuals captured by AVA systems remain anonymous.
    2. How you are ensuring individuals are properly informed of recording and facial detection performed on premises.
    3. Your data policy and transparency around sharing of information to third parties.
    4. How your AVA system differs from Cadillac Fairview’s AVA system involved in the joint investigation conducted in 2020 by the Offices of the Privacy Commissioner of Canada, Alberta and British Columbia. 

    Innovation does not have to come at the cost of personal privacy. We are concerned by the lack of clarity surrounding the privacy notice and the privacy statements on CDM’s website. We are also alarmed at the prospect of a foreign company taking over CDM’s ad billboards to operate software that may fall outside of Canadian laws and regulations.

    We would greatly appreciate a response within 10 business days. Please reply to this email or reach out directly to Khasir Hean at khasir.hean@gmail.com / 226-927-2677 if you have any questions or would like to discuss details further.

    Best,

    Adam Motaouakkil
    [email redacted for privacy]

    Jitka Bartosova
    [email redacted for privacy]

    Khasir Hean
    khasir.hean@gmail.com

    Technologists for Democracy
    techfordemocracy.ca 

    Signing organizations:

    Canadian Tech for Good
    Nikita Desai
    [email redacted for privacy]

    More Transit Southern Ontario (MTSO)
    Jonathan Lee How Cheong
    [email redacted for privacy]
    https://www.moretransit.ca/ 

    OpenMedia
    Matt Hatfield
    [email redacted for privacy]
    https://openmedia.org/

    Tech Workers Coalition Canada
    Jenny Zhang
    [email redacted for privacy]
    https://techworkerscoalition.org/canada/

    TTCriders
    Andrew Pulsifer
    [email redacted for privacy]
    https://www.ttcriders.ca/

    Signing individuals:

    [names and emails of 25 individuals redacted for privacy]

    Appendix

    Figure 1: Privacy notice attached to digital signage by Cineplex Digital Media.

    1. Joint investigation of the Cadillac Fairview Corporation Limited by the Privacy Commissioner of Canada, the Information and Privacy Commissioner of Alberta, and the Information and Privacy Commissioner for British Columbia. October 28, 2020. https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2020/pipeda-2020-004/ ↩︎
    2. Anonymous video analytics’ future uncertain after Canadian privacy regulators’ investigation. November 4, 2020. https://www.blg.com/en/insights/2020/11/anonymous-video-analytics-future-uncertain-after-canadian-privacy-regulator-investigation ↩︎
    3. Jiang, Kevin. These ads near Union Station and other places around Toronto could be recording you. What you need to know. November 5, 2025. https://www.thestar.com/news/gta/these-ads-near-union-station-and-other-places-around-toronto-could-be-recording-you-what/article_7af7c920-1ce7-4b19-98db-4c22d742f202.html ↩︎
    4. See Appendix for the privacy notice in question (Figure 1). ↩︎
    5. Information on AVA | CDM. https://www.cdmexperiences.com/information-on-ava ↩︎
    6. Privacy Policy | CDM. Effective date April 10, 2024. https://www.cdmexperiences.com/privacy-policy ↩︎
    7. Deschamps, Tara. Cineplex selling digital signage unit to U.S. company Creative Realities for $70M. October 16, 2025. https://toronto.citynews.ca/2025/10/16/cineplex-digital-media-sale-signage/ ↩︎

  • Stop Cineplex from Facial Detection

    November 2025 — Now

    TL;DR: Cineplex Digital Media (CDM) makes digital billboards with cameras performing facial detection.

    One of the digital billboards performing facial detection at Union Station Bus Terminal.

    In early November, a Redditor discovered that ads at Union Station Bus Terminal have tiny cameras attached. The small privacy notice on the ads indicated that they run “anonymous software” to “generate statistics about audience counts, gender and approximate age”. CDM’s own website reveals that these units perform facial detection on anyone nearby!

    The privacy notice in question.

    CDM Cineplex Digital Media

    This media unit runs anonymous software, used to generate statistics about audience counts, gender and approximate age only.

    To ensure your privacy, no images and no data unique to an individual person is recorded by the camera in this unit. Images are processed in a few milliseconds before being immediately and permanently deleted.

    More information on the anonymous software and our Privacy Policy can be found at

    www.cdmexperiences.com/information-on-ava

    or scan the QR code below.

    News agencies quickly covered the issue, with articles coming out at Now Toronto, the Toronto Star, CTV News and Global News.

    Close-up of the camera on top of the billboard.

    Volunteers at Technologists for Democracy (TfD) have written an open letter to CDM. Organizations including OpenMedia, TTCriders and More Transit Southern Ontario have signed on, calling for CDM to answer our privacy concerns.

  • Letter to ISED on National AI Strategy

    In response to Innovation, Science and Economic Development Canada’s (ISED) 30-day national sprint on shaping Canada’s AI strategy, we submitted the following letter.

    See original PDF here.


    October 31, 2025

    Re: Help define the next chapter of Canada’s AI leadership

    National AI Strategy Consultation Needs Wider Variety of Perspectives

    Dear members of the Canada AI Strategy Task Force:

    We are Technologists for Democracy, a grassroots advocacy organization based in Toronto.

    We appreciate Innovation, Science and Economic Development Canada’s vision to renew national AI strategy. However — considering the impact that such a strategy will have on various aspects of public life, democratic values, and individual rights — we maintain that public participation is not just a nice-to-have, but a democratic and practical necessity. We strongly urge that something as wide-reaching and important as a national AI strategy not be formulated in a sprint as short as a month. 

    Such a strategy requires a more thoughtful and detailed approach. Expediting the process leads to a higher risk of unforeseen public harm, wasted resources, and loss of public trust. We urge ISED to consider extending the deadline to allow for a wider variety of perspectives and responses from the public.

    We also observe that the current consultation on national AI strategy is highly biased towards an economic perspective, which puts the public at risk of exploitation by industry partners. We understand that ISED’s mandate focuses on economic development and trade; however, national AI strategy must take into consideration a variety of perspectives.

    If Canada wants to compete on the global stage for AI talent, we should lean into our strengths: diversity, inclusivity, and prudence. It is difficult to match the level of investment that United States investors direct towards AI firms, located primarily in Silicon Valley. We must compete in other ways.

    Investors and entrepreneurs alike are aware of Canada’s risk-averse nature surrounding capital expenditure. This is not something to be ashamed of — Canada escaped from the worst effects of the 2008 financial crisis due in large part to caution and regulation surrounding financial institutions.

    Consideration must also be given to the possibility that AI investments have created a bubble: a technical bubble, in which the technology does not function well enough to deliver the value claimed, and an economic bubble, in which investors lose faith in AI companies and pull capital as investment becomes unsustainable. Mitigation strategies should evaluate the value of physical AI infrastructure, such as data centers, in case of a bubble collapse.

    KPIs should not emphasize the adoption of AI purely for its own sake. Canada should focus on providing resources for education on how various forms of AI can aid business processes, with special attention to how AI can be deployed fairly and ethically. We believe measures should focus on:

    • Adoption and implementation of strong governance frameworks that minimize the potential harm of AI systems and ensure fair and transparent outcomes.
    • Employment creation by AI, rather than workforce reduction.
    • Creation of mechanisms for continuous feedback to guide improvements and provide targeted training opportunities.

    AI should not be forced onto society only for the economic benefits that it may bring. AI should serve and augment society, elevating those who would otherwise be invisible, and contributing to a fairer and more inclusive Canada. It is important to keep humans in the loop in such a way that we humans get the most out of machines, as opposed to machines making the most out of humans.

    We reiterate that national AI strategy must:

    • Be made in consultation with a wider variety of perspectives.
    • Not be exclusively focused on the economic implications of AI.
    • Take into consideration the possibility of an AI bubble bursting.
    • Focus on fair, ethical and transparent AI governance frameworks.

    Now is the pivotal time to build an AI strategy that serves all Canadians.

    We must build it right.

    Sincerely,

    Khasir Hean
    [email redacted for privacy]

    Gurpreet Kaur
    [email redacted for privacy]

    Jim Rootham
    [email redacted for privacy]

    Jenny Zhang
    [email redacted for privacy]

    Adam Motaouakkil
    [email redacted for privacy]

    Technologists for Democracy
    techfordemocracy.ca 

  • Speed Cameras are an Effective Technology for Public Safety

    See PDF version here.


    FOR IMMEDIATE RELEASE

    Media contact: Khasir Hean – 226 927-2677 – khasir.hean@gmail.com

    September 25, 2025 (Toronto, ON) – Technologists for Democracy is a grassroots, volunteer-run advocacy organization. We are disappointed by Premier Ford’s decision to ban automated speed enforcement (ASE) cameras. The Hospital for Sick Children and Toronto Metropolitan University found that ASE cameras in Toronto reduced the proportion of people speeding by 45% and lowered motor vehicle speeds by 7 km/h overall.1

    Reducing motor vehicle speeds is proven to improve safety. When the City of Toronto reduced speed limits from 40 km/h to 30 km/h, collisions decreased by 28%, and major and minor injuries decreased by 67%.2 Additionally, 73% of Ontarians support ASE cameras in targeted zones.3

    These facts point to the importance of, and popular support for, enforcing speed limits and reducing motor vehicle speeds. We call on the Province of Ontario to commit to public safety by using ASE cameras to ensure that drivers obey posted speed limits, reducing collisions and injuries.

    Instead of spending taxpayer dollars on removing existing ASE camera infrastructure, we call on Premier Ford to cancel the upcoming ban on ASE cameras. The Province has not provided details of the provincial fund that would pay for the removal of ASE camera infrastructure, nor for replacing ASE cameras with safe street infrastructure such as roundabouts or bike lanes. Pedestrian- and bike-friendly infrastructure reduces the incidence of collisions and speeding in areas where these interventions are applied.

    Separately, there are a number of privacy risks related to ASE cameras. To maintain public trust, we must ensure that privacy safeguards are built into ASE systems and policies so that data sharing and security best practices are followed. For instance, audio, visual and related data should be used only for speed monitoring, and not for other law enforcement purposes such as surveillance. Similarly, license plate data collection and management should be constrained to speed monitoring and enforcement only, and should not be combined with other datasets for a different purpose. ASE cameras should not become multipurpose tools used for anything other than monitoring speed.

    We believe that a significant portion of the opposition to ASE cameras is caused by sub-par implementation. We therefore also call on municipalities to improve implementation by focusing on the driver experience – for example, by reducing the delay between a speeding incident and ticket issuance, putting up signage and publishing open data on ASE camera locations, or providing clear feedback to drivers about where speeding occurred. The public has a right to know where ASE cameras are, following transparency and open government principles.

    We at Technologists for Democracy believe that ASE cameras are an effective technology for maintaining public safety. In summary:

    • We call on the Province to take a stand for public safety by allowing for the implementation of ASE cameras.
    • We request that the data policies of ASE systems be designed with safeguards to use personal data minimally and protect public privacy.
    • We call on municipalities to improve ASE systems to focus on the driver experience and improve trust.

    1. The City of Toronto Automated Speed Enforcement Program Evaluation. March 28, 2023. https://www.toronto.ca/wp-content/uploads/2023/07/96cc-Automated-Speed-Enforcement-Program-Evaluation.pdf ↩︎
    2. Effect of reducing the posted speed limit to 30 km per hour on pedestrian motor vehicle collisions in Toronto, Canada – a quasi experimental, pre-post study. Fridman et al. February 10, 2020. https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-019-8139-5 ↩︎
    3. CAA survey suggests 73% of Ontarians support speed cameras in targeted zones, even as cams cut down. Gabriela Silva Ponte. July 23, 2025. https://www.cbc.ca/news/canada/toronto/survey-finds-ontarians-support-speed-cameras-1.7592325 ↩︎

  • Letter to Competition Bureau on Algorithmic Pricing

    In response to the Competition Bureau Canada’s call for feedback on algorithmic pricing, we submitted the following letter.

    See original PDF here.


    August 3, 2025

    Re: Algorithmic Pricing and Competition

    Algorithmic Pricing Needs Greater Transparency and Guardrails

    Dear members of the Competition Bureau,

    We are Technologists for Democracy, a grassroots advocacy organization based in Toronto. Our position largely concerns the effects of algorithmic pricing from the consumer perspective.

    Most consumers – 68% – feel that dynamic pricing unfairly takes advantage of them.1 This is not necessarily because dynamic pricing is inherently unfair, but because most companies operate dynamic pricing through a purely profit-oriented lens focused on the short term. Little thought is given to transparency, company trust, or consumer well-being.

    Consumers are particularly vulnerable to the downstream effects of algorithmic pricing, especially in markets for essential goods, markets with high barriers to entry, and effective monopolies. Without reasonable alternatives, consumers become captive to price increases. Low-income consumers are particularly affected, because any given purchase constitutes a larger portion of their income. Housing is one such essential market – we welcome the Competition Bureau’s current probe into the use of algorithmic pricing for setting rental prices, and would recommend prohibiting the use of algorithmic pricing for housing (recommendation #6 below).

    In addition, we believe algorithmic pricing can lead to market inefficiencies. When companies offer personalized dynamic pricing, with a different price for each individual consumer, it becomes difficult for consumers to compare prices across competitors and recommend them to others, and difficult for competitors to efficiently set prices according to the market. We see personalized dynamic pricing as a dangerous opportunity for larger entities to unfairly exploit their market dominance by reducing the information available to their competitors.

    To limit the harmful downstream outcomes of algorithmic pricing, our recommendations focus on improving transparency and guardrails surrounding algorithmic pricing. In the same way that nutrition facts labels and ingredient lists allow consumers to make informed decisions before purchasing food, labels on algorithmic pricing would allow consumers to make informed decisions before purchasing digital and digitally-enhanced products and services. 

    To improve transparency, we support regulation and enforcement that would require companies to clearly disclose the following (an illustrative sketch of such a disclosure appears after the list):

    1. Whether or not algorithmic pricing is in effect for a given product or service.
    2. If algorithmic pricing is in effect, whether the pricing model was developed in-house or is outsourced to a third party.
    3. If algorithmic pricing is in effect, whether or not AI/machine learning is used for algorithmic pricing, as opposed to a rule-based model.
    4. If algorithmic pricing is in effect, what data is fed into the pricing model. For example:
      1. Consumer data such as location, credit score or demographic profile,
      2. Inferred data such as consumer emotional state,
      3. Internal data such as sales counts,
      4. External data such as competitor prices or current weather,
      5. Etc.
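
    As a hedged illustration only, and not an existing standard, a machine-readable version of such a disclosure covering points 1 through 4 could be as simple as the following sketch (all field names are hypothetical):

      # Hypothetical machine-readable "algorithmic pricing label" covering disclosures 1-4.
      # Field names are illustrative only; no such standard currently exists.
      pricing_disclosure = {
          "algorithmic_pricing_in_effect": True,     # disclosure 1
          "pricing_model_provider": "third_party",   # disclosure 2: "in_house" or "third_party"
          "model_type": "machine_learning",          # disclosure 3: or "rule_based"
          "model_inputs": [                          # disclosure 4
              "consumer_location",                   # consumer data
              "inferred_emotional_state",            # inferred data
              "internal_sales_counts",               # internal data
              "competitor_prices",                   # external data
          ],
      }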

    To ensure enforcement of these policies, we support:

    5. Establishing teams and processes to handle complaints and appeals related to the recommendations in this letter.

    In addition to these disclosure-focused measures, we also support investigating the feasibility of:

    6. Prohibiting personalized, dynamic algorithmic pricing from being used in certain market sectors, such as those of essential goods and services (including food, housing, and medication).
    7. Regulating prices, for example through a Maximum Retail Price policy, which has been effective in countries such as India.

    We believe that these suggestions will:

    1. Improve consumer trust in algorithmic pricing.
    2. Inform consumers about what personal data is involved in making a purchase.
    3. Lower the competitive barrier to entering markets already populated with algorithmic pricing models, while still allowing companies to maintain the secrecy of the inner workings of proprietary pricing models.
    4. Reduce possible harms to consumers and competitors alike by preventing predatory pricing.
    5. Ensure equal opportunity of access to essential goods and services for consumers.

    We urge the Competition Bureau to increase transparency and implement guardrails on the use of algorithmic pricing.

    Sincerely,

    Khasir Hean
    [email removed for privacy]

    Henry Wilkinson
    [email removed for privacy]

    Jenny Zhang
    [email removed for privacy]

    Cole Anthony Capilongo 
    [email removed for privacy]

    Technologists for Democracy
    techfordemocracy.ca


    1. Gartner Marketing Survey Finds 68% of Consumers Report They Feel Taken Advantage of When Brands Use Dynamic Pricing. December 16, 2024. https://www.gartner.com/en/newsroom/press-releases/2024-12-16-gartner-marketing-survey-finds-68-percent-of-consumers-report-they-feel-taken-advantage-of-when-brands-use-dynamic-pricing ↩︎