Showing posts with label new media. Show all posts

Saturday, May 15, 2010

The value of privacy

I've been reading a book by Daniel J. Solove called "The Future of Reputation: Gossip, Rumor, and Privacy on the Internet". With the privacy concerns around Facebook now making mainstream news, it's interesting to take stock of just what it is that makes privacy important. Solove has helped clarify that for me.

I would put it like this: privacy is an acknowledgement that humanity - both individuals and society - is imperfect in many ways, and it is a means by which people can cope with that imperfection.

Privacy enables people to hide character defects from others. This sounds like something that shouldn't be done, but it's actually necessary to some degree if we're ever going to make friends with anyone. As Solove says (p66-67): "When intimate personal information circulates among a small group of people who know us well, its significance can be weighed against other aspects of our personality and character. By contrast, when intimate information is removed from its original context and revealed to strangers, we are vulnerable to being misjudged on the basis of our most embarrassing, and therefore most memorable, tastes and preferences". Privacy prevents us from being judged by people unfamiliar with us solely on the basis of our sensationalistic flaws.

Even if people have perfect information about us, this is no guarantee that their judgements will be fair or accurate. People are irrational beings, by and large, and privacy enables people to hide information about themselves that might lead to stereotyping and prejudice - sexuality and mental illness are some historical examples. Even in cases where a judgement might be accurate - employers discovering whether a potential employee has a greater predisposition to a fatal illness like cancer is the example Solove gives (p71) - it's still possible to view this judgement as unfair. Is it fair to deny employment to an individual because they're statistically more likely to get cancer? Should disclosure of such medical information be protected by law? I suspect that in many if not most cases it already is.

Privacy permits violation of social norms. Again, this sounds bad but isn't necessarily so. It is an acknowledgement that there can be conflict between what society demands of an individual and what an individual desires for themself, and a recognition that sometimes the desires of the individual should take precedence. Solove puts it thus (p72): "Most of us desire a limited realm where we might have reprieve from the judgment of others, which otherwise might become suffocating".

Finally privacy helps us to overcome our individual imperfections and grow as human beings. It does this firstly by providing the opportunity to hide our pasts, giving us the opportunity to break free of them, in essence letting us have a second chance at life. Solove again (p73): "Protection against disclosure permits room to change, to define oneself and one's future without becoming 'a prisoner of one's recorded past'".

It does this secondly through group privacy, or the opportunity to have different identities among different people. This is not dishonest, but simply an acknowledgement of the complexity of human beings, that "when you play various roles you're not being artificial or phony. These roles let you accentuate different aspects of yourself" (Arnold Ludwig quoted in Solove, p69). This roleplay enables personal growth, because "people even play roles in which they seem improperly cast, hoping to grow into the part. One plays a role until it fits, becoming transformed in the process." (Solove, p68) This is not possible without privacy.

It would be very easy to go through this list and point out instances where privacy should not apply, and Solove recognises as much, explicitly calling for a balance between the benefits of privacy and the benefits that disclosure of information about others can bring. But in a time when it's not uncommon to hear the claim that privacy is dead, I want to point out just what is being lost if privacy is lost altogether, and question whether our society, and the people within it, can cope.

Is there any society in which nobody has any imperfections that they legitimately do not want disclosed to others? In which nobody ever judges anybody else in a way that is unfair or inaccurate? In which people never have any need to violate the existing norms of society in even the mildest, slightest way? In which nobody ever needs to develop themselves personally beyond the labels already placed upon them by the society around them? I don't think there is. And I don't think there ever will be.

As long as humanity is imperfect, we will need something that performs the functions that privacy currently performs. If we are indeed losing our privacy, then we need to do one of two things. We need to come up with something that does for us what privacy has done for us in years past, and fast. Or else we need to get our privacy back. And fast.

Saturday, August 05, 2006

Down with buzzwords

Bleeurgh, but I guess it had to happen eventually: some yutz is trying to coin the term Web 3.0 to push their pet futurist vision. Basically it's just a re-hash of the concept of a Semantic Web, which was buzzing around a few years back and which Clay Shirky pooh-poohed. He criticised the model as unrepresentative of reality, and criticised the people pushing the Semantic Web as stupefyingly negligent of this. He described how nearly every framing of an example problem the Semantic Web was supposed to solve actually obscured the problem's difficulties rather than demonstrating the Semantic Web's value: "First, take some well-known problem. Next, misconstrue it so that the hard part is made to seem trivial and the trivial part hard. Finally, congratulate yourself for solving the trivial part."

It holds true for the article on "Web 3.0": "Once machines can understand and use information, using a standard ontology language, the world will never be the same." Yes, the Semantic Web could work wonderfully once a standard ontology language exists, but that's the trivial part. Actually creating that standard is the hard part. And the article treats it as trivial: "However, if we were at some point to take the Wikipedia community and give them the right tools and standards to work with (whether existing or to be developed in the future), which would make it possible for reasonably skilled individuals to help reduce human knowledge to domain-specific ontologies, then that time can be shortened to just a few years, and possibly to as little as two years." Sure. And if an infinite number of monkeys could be provided with an infinite number of typewriters then we'd see every work of writing that could ever be written get written. This is pie-in-the-sky stuff, and simply throwing thousands of wikipedia volunteers at it isn't going to magically create "tools and standards" that have only a theoretical existence at best.

Standard ontology for everything from wikipedia volunteers? I've seen wikipedia volunteers arguing over whether it's best to use the word "kidnapped", "abducted", or "captured" to describe what Hizbollah did to two Israeli soldiers recently, with no clear consensus reached, and objections raised that at least one of the three was misleading. What standard?
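Shirky's "trivial part versus hard part" criticism can be made concrete with a toy sketch. Everything below is invented purely for illustration - the triples, the predicate names and the query are hypothetical, not any real Semantic Web vocabulary or tool:

```python
# A toy "ontology": a set of subject-predicate-object triples.
# All names here are hypothetical, made up for this example.
triples = {
    ("soldier_a", "detained_by", "group_x"),
    ("soldier_b", "detained_by", "group_x"),
}

def held_by(group):
    # The trivial part: once everyone has agreed on the predicate
    # vocabulary, machine querying of the triples is a one-liner.
    return sorted(s for (s, p, o) in triples
                  if p == "detained_by" and o == group)

print(held_by("group_x"))  # ['soldier_a', 'soldier_b']

# The hard part is everything this toy skips: should the predicate have
# been "detained_by", "kidnapped_by", or "captured_by"? Different
# volunteers would encode the same event in incompatible terms, and no
# amount of tooling makes that disagreement go away.
```

The query is the easy half; agreeing on the vocabulary it runs over is exactly the standardisation problem the article waves away.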

Monday, July 31, 2006

A thought for early morning:
The way we engage our senses most comfortably strongly influences, in non-obvious ways, how we use and develop communications media.

Phones, for instance, work fairly effectively because the way we hear can cope comfortably with that style of media communication. Videophones, with people's heads displayed to one another, do not work comfortably. They require that a person stay still with their eyes focused on a single point if they want their heads displayed, which is not a particularly comfortable position. Trying to see the other person on a videophone similarly limits a person's freedom of movement in a way that listening to a person on an audio phone doesn't. Until it's as easy to move about while using a videophone as it is with a plain audio phone, they won't catch on. I don't know how that'd work, though. Holographic technology of some kind?

Then there's television. The style of it seems dumbed-down and simple, and people bemoan this. But fixing one's visual perception on a single point of reference can be a tiring experience. More natural is a situation in which people's visual attention is diffuse. A dumbed-down, easily digested form of visual stimulus is the form of media most suited to a diffuse mode of seeing: people's attention can fade in and out, and what few details they do miss can easily be filled in owing to the simplistic nature of the overall content.

I also suspect that there's a natural tendency for people to want to give feedback on sensory input. Where this isn't possible, you get situations where the input becomes a background thing more usually than a foreground thing, like radio. This may also be why television tends to resort to sensationalism and vivid visual imagery to attract attention: it needs to compensate for the disengagement created by the viewer's inability to give feedback on the sensory input provided.

Does this urge for engagement with sensory stimulation rather than passively receiving it exist? Is it a natural human urge? I honestly have no idea. I suspect that something like it exists, and understanding it would allow deployment of new, much more effective media devices the likes of which we have not yet seen.

Tuesday, July 25, 2006

History of Social Networking Sites and History of Operating Systems - An Analogy

Desktop OS history, starting point: many different systems, all distinct, not interoperable, none particularly standing out as the One and Only.

Social Networking History analogy: blogging goes mainstream. Blogspot, TypePad, Livejournal, all with their own style, none particularly dominant.

Desktop OS history: somehow a technically loathsome OS becomes not just the dominant player, but soon transforms itself into the only game in town even as people who think an OS shouldn't, you know, absolutely suck try to wrap their heads around the fact of its dominance. Microsoft Windows entrenches itself as a monopoly.

Social Networking History analogy: Myspace. Enough said.

I would have to say that, to carry the OS analogy, Vox currently reminds me a lot of OS X. It's designed for ease of use and has a technical superiority that Myspace lacks. I hope it doesn't close itself off the way Apple tends to do with its software and hardware.

What I think Social Networking actually needs is a kind of Open Source movement. There needs to be some sort of development of an open alternative to closed systems like Myspace - one which allows different Social Networking hubs' users to interact with one another, or even change hubs if they like. Unfortunately, the current social networking companies' approach looks to me like they're working on the assumption that users need to be put in a kind of proprietary lockdown, getting their social fix from their company's site and their company's site alone. I hope that way of thinking changes, or Myspace looks poised to become the Microsoft of the social networking world.

On the other hand, if that means Myspace's new owner, News Corporation, becomes as hated in future as Microsoft is hated today, I might actually go in for that - no fan of Rupert Murdoch, me. But the cost would probably be too great.

Wednesday, July 19, 2006

Web 2.0

O'Reilly -- What is Web 2.0

Web 2.0: it's one of the most popular tags on del.icio.us, generates huge amounts of interest among the people interested in the business side of tech companies, but there's a strong undercurrent of sentiment in geekdom that the whole thing is just one great big marketing gimmick. Personally, I think that there's definitely a "there" there, but it's highly misunderstood.

The O'Reilly article above explains a fair bit, but the way we arrived at even having a concept of Web 2.0 is what interests me. From some of the commentary around, you'd think it's a brand new, revolutionary conception of the way the Internet works. It isn't.

One of the best analogies I've seen for Web 2.0 is that it's like version 2.0 of a software application: promising to do all the things that version 1.0 was supposed to do but couldn't quite manage. Let's apply the analogy to the actual development of software as I understand it...

ShinyApp v1.0 has been released. Work has started on the next version. In the open-source world you actually get to see this process happening. Features are added, bugs are ironed out, new things are tried. At some arbitrary point a feature freeze sets in, and there's a push towards a final release. At some arbitrary point - perhaps a set date, perhaps an official pronouncement that all major bugs have been ironed out - the product is deemed "complete", and ShinyApp v2.0 appears. To the unclued-in user, it may look like the new, improved product has appeared full-formed, like Athena being birthed fully-grown from Zeus' skull. In reality, it's the culmination of a long period of development across various aspects of the software, some of which may still have been incomplete at the time of the "official" software release.

All Web 2.0 really is, is the maturation of several technological and sociological trends that have been developing for years, but which are just now coalescing into a comprehensible whole. Web 2.0 isn't just a product label you can slap onto your tech business in order to make a profit. It's a...let's see: it's a current snapshot of the significant sociological and psychological changes in human online interactivity that are occurring as a result of recent technological advances in many-to-many communication media (collectively referred to as "the Internet"), as well as the discovery of new ways of using the older technologies. Business people shouldn't be the ones discussing it; social scientists should.

The O'Reilly crowd misses a point that their business audience really should be made aware of: these trends are continuing, and the Next Big Thing(tm) may be the further evolution of a trend that nobody has anticipated, because everyone's trying to conform to some arbitrarily defined, rigid standard of "where the web is right now", aka Web 2.0.

The other thing missing from their analysis is a technological advance that changes the dynamic of the web significantly. They talk a lot about users creating and consuming each other's content, but neglect the advances in software technology that give users significant ability to not receive content they don't want to receive: I'm thinking of the Firefox extension Adblock here. What's a business person to do when the preferred method of revenue-raising on the web today - online ads - is getting blocked out by a growing number of users? And then there's the Greasemonkey Firefox extension, which provides a whole framework for giving users control over how they rearrange a content-provider's content for their own consumption...
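The Adblock point is easy to sketch. What follows is not Adblock's actual implementation - real filter lists and matching rules are far more sophisticated - just a minimal illustration of the idea, with made-up patterns and URLs:

```python
import fnmatch

# Hypothetical blocklist patterns, in the spirit of an ad-filter list.
BLOCKLIST = ["*/ads/*", "*doubleclick*", "*banner*.gif"]

def is_blocked(url):
    # A request is dropped if its URL matches any blocklist pattern.
    return any(fnmatch.fnmatch(url, pattern) for pattern in BLOCKLIST)

page_requests = [
    "http://example.com/article.html",
    "http://example.com/ads/banner123.gif",
    "http://ad.doubleclick.example/track",
]

# Matching requests are simply never fetched.
fetched = [u for u in page_requests if not is_blocked(u)]
print(fetched)  # ['http://example.com/article.html']
```

From the advertiser's side the effect is total: the blocked request is never made at all, so it is never seen and never counted - which is exactly the revenue problem described above.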

Web 2.0 is primarily a marketing term, but it really shouldn't be. It ought to be the jumping off point for determining how the Information Revolution is affecting, and will continue to affect, modern society.