QIA | September 29, 2023


Modern technologies power automated property services and provide insights in the pursuit of efficiency, sustainability and the satisfaction of tenant needs. But there are legal factors to consider.

Computer vision and facial recognition used for purposes such as access control, safety, and the detection and prevention of crime are testing the boundaries of the law. Without the individual’s explicit consent, some of these uses may not be permitted. A further complication is that employee consent could be invalid if affected by the employer’s dominant role. Facial recognition also gives rise to the “bystander issue”, where an individual’s data is incidentally processed without their consent merely to confirm a non-match. These strict rules exist because biometric data is unique to every person, and it must not be used without a very strong justification.

Digitalisation of the workplace enables greater tracking of staff activity, which can have implications under data protection and employment law.

The recent push for flexible working has dramatically changed tenant needs – property managers must adapt or fail. Various footfall sensors, whether based on movement, sound, heat or other tracking, enable occupancy and service utilisation monitoring. This data is vital for space optimisation, safety, and meeting demands for a growing list of essential amenities. Better metrics will likely raise the bar for compliance with property regulation, such as occupancy limits and fire safety obligations.

Smart energy features automatically regulate lighting, heating and devices in the office. Whilst anonymous data will often suffice, personalisation features will involve the processing of personal data, which is subject to the UK GDPR. For example, a tenant could set their desired room temperature in an online account linked to the building management system. However, such features will likely involve the collection of large amounts of data by the property manager, the technology provider and other parties. Compliance issues can be overlooked by property managers who fail to retain control over how such data is used.

Predictive analytics powered by machine learning and artificial intelligence (“AI”) features help formulate business strategies, anticipate issues and identify opportunities. Larger datasets enable better results; however, increased data collection carries a compliance burden which must not be ignored. Technology agreements often favour the technology provider’s use of data at the expense of service users’ privacy.

As publicly available generative AI celebrates its first year, we continue to be impressed by its seemingly endless capabilities. Architects can generate design schemes on command, large amounts of raw data can be organised within milliseconds, video footage can be used to identify trends, and information about individuals can be accessed and decisions made in an instant. However, AI technologies give rise to unresolved questions of liability which ought to be addressed.

With all these developments, reviewing contracts, enforcing policies, ensuring staff have appropriate skills and training, and assessing the privacy impacts of new technologies from the outset are more important than ever. Getting it wrong could lead to complaints, claims for compensation, aggravated personal data breaches, regulatory scrutiny and reputational damage.