How many of you, between waking up and your first cup of hot, caffeinated beverage, told the world something about yourselves online? Whether it be a social media status update, an Instagram photo or story, or even a tweak to your personal profile on LinkedIn. Maybe yes, maybe no, although I would bet that you've at least checked your notifications or emails, or had a scroll through the newsfeed.
Another way to frame the question: how many of you have interacted with the internet in some way since waking up? The likelihood is fairly high.
In my first blog on the topic of Psychodynamics, I focused solely on our inseparable connection to modern technologies – smartphones, tablets, etc. – and the access they give us to interact with society and the world. For most of us, this is an undeniable truth of daily life. A major nuance of this relationship between people and technology, and one I think we are probably somewhat in denial about, is the security and privacy of our personal information online.
To the technology community at large, it's no secret that our personal data is harvested and traded by governments and billion-dollar corporations on a constant basis: whatever information is desired – and, more importantly, information about that information – goes to the highest bidder or for the best market rate. Facebook, for instance, doesn't sell your information outright; that would be plainly unethical and would erode trust in its brand. What it sells is access to you: the advertisers and large corporations connected through the platform gain valuable consumer data, which they use to target, advertise and sell back to you based on your online habits.
My reason for raising this topic in the context of psychodynamics and technological-behavioural patterns is to ask consultants and tech professionals to consider what data privacy means to our/your valued clients.
I was fortunate to participate this past week in a seminar hosted by the University of New South Wales' Grand Challenges Program, established to promote research into technology and human development. The seminar featured guest speaker Professor Joe Cannataci, the UN's Special Rapporteur on the right to privacy, who is in town to discuss recent privacy issues with our Federal Government, specifically amid concerns about the security of the Government's My Health Record system (see the full discussion on ABC's RN Breakfast show). Two key points raised during the seminar, and from Professor Cannataci's general insights, were:

  1. Data analytics targeting individuals or groups focus largely on metadata, not on the content an individual or group produces. In practice, businesses are unlikely to treat content as scalable unless the metrics show positive or viral trends in viewership and consumption patterns (a short illustrative sketch follows this list).
  2. Technology, its socialisation and the privacy of personal information are no longer issues specific to a generation ("boomers", "millennials") or a context (though countries like China and Russia prohibit and filter certain URLs and web services). That is to say, in an individual's daily working routine, the engagement with technology and the push to submit data to get a task done may, in some instances, form an unconscious processing pattern over time: we get used to sharing our personal information and adopt the mindset "well, I have nothing to hide". I believe we've all likely been guilty of it before. Jane might not think about how sharing her client's files with her colleague Judy, to assist with advising on a matter, may put their employer in breach of a binding confidentiality agreement.

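To make that first point a little more concrete, here is a minimal, hypothetical Python sketch. Every field name and value below is invented for illustration – it isn't any real platform's data model – but it shows how a crude advertising profile can be assembled from a post's metadata alone, without ever reading the content:

```python
# Hypothetical example: the metadata attached to a single social post often
# says more about the user than the post text itself.
post = {
    "content": "Nothing beats the first coffee of the day!",
    "metadata": {
        "timestamp": "2018-08-14T07:42:10+10:00",  # when you are awake and online
        "device": "iPhone 8",                      # what hardware you carry
        "location": "Sydney, NSW",                 # where you live or work
        "app_version": "Instagram 58.0",
        "follower_count": 412,
    },
}

def profile_from_metadata(post):
    """Build a rough consumer profile without looking at the content at all."""
    meta = post["metadata"]
    return {
        "active_hour": meta["timestamp"][11:13],  # an early-morning scroller
        "device_tier": "premium" if "iPhone" in meta["device"] else "standard",
        "region": meta["location"],
        "reach": "micro" if meta["follower_count"] < 1000 else "macro",
    }

print(profile_from_metadata(post))
# {'active_hour': '07', 'device_tier': 'premium', 'region': 'Sydney, NSW', 'reach': 'micro'}
```

The point isn't the code; it's that the timestamp, device and location reveal far more about the person than their caption about coffee ever will.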
My recent projects involved heavy content extraction and planning without immediately considering metadata trends or what business departments were likely to need from that content; the focus was on documented business processes rather than data usage patterns. Particularly when working with cloud technologies that were new to the given clients, there was only a basic understanding of what this entailed in regard to data privacy and the legalities around it (client sharing, data visibility, GDPR, to name a few). Perhaps a consideration here is to investigate further how these trends feed into, and possibly alter, business processes, rather than treating them as separate factors in information processing.
Work is work, but so is our duty to due diligence and best practice, and to understanding how we, as technology professionals, can help resolve some of these ethical issues in today's technology landscape.
For more on Professor Joe Cannataci, please see his profile on the United Nations page.
Credit to UNSW Grand Challenges Program. For more info, please see their website or follow their Facebook page (irony intended)

Category:
Case Study, Internet of Things, Security, Technology

1 Comment

  1. Michael, another very incisive and thought-provoking piece about the people aspect of technology. The use of technology is proceeding at such a rapid pace that the implications/ethics of using or misusing said technology are a distant afterthought once the impact has occurred. It would be an interesting area of investigation to understand how organisations try to put the 'genie back into the bottle' by setting policies and procedures to manage the impact of the change to our work lives due to technology use. Looking forward to the next blog.
