Orwell vs. Huxley: two dystopian worlds, compared

In 2009, Stuart McMillen, the famed Australian comic artist, published a drawn rendition of a short passage from Neil Postman’s Amusing Ourselves to Death.

The passage compares the radically different worlds depicted by George Orwell in his “1984” and by Aldous Huxley in his “Brave New World.” Both novels show an Earth whose inhabitants have been rendered helpless and brainwashed, and both are considered quintessential dystopian novels; the term Big Brother, after all, was coined by Orwell for his book. Yet they depict radically different approaches to enslaving humankind.

I’ll leave you to the words of Postman and to the wonderful, if somewhat spine-chilling, imagery of McMillen.

What Orwell feared were those who would ban books.
What Huxley feared was that there would be no reason to ban a book, for there would be no one who would want to read one.

Orwell feared those who would deprive us of information.
Huxley feared those who would give us so much that we would be reduced to passivity and egotism.

Orwell feared the truth would be concealed from us.
Huxley feared the truth would be drowned in a sea of irrelevance.

Orwell feared we would become a captive culture.
Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy and the centrifugal bumblepuppy.

As Huxley remarked in “Brave New World Revisited”, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions.”

In “Nineteen Eighty-Four”, people are controlled by inflicting pain.
In “Brave New World”, people are controlled by inflicting pleasure.

In short, Orwell feared that what we hate will ruin us.
Huxley feared that what we love will ruin us.

It is worth noting that Huxley, 26 years after publishing his novel and with World War II having happened in between, wrote the aforementioned essay “Brave New World Revisited”, in which he analyzes how accurate his earlier assumptions turned out to be.

Both novels, and possibly also Huxley’s and Postman’s essays mentioned above, should be — in my humble opinion — read by anybody who has any interest in the future of humanity, even though it might mean having to deal with uncomfortable truths.

Add money to your likes: Flattr’s microdonation system

Many social networks today employ the concept of “favoriting” items: images on Flickr, Instagram and 500px; songs on SoundCloud; videos on YouTube and Vimeo; tweets on Twitter; repositories on GitHub; and so on.

When you “favorite” or “like” something, you’re essentially telling the author that you’d like more of that. Yet, when it comes to creative endeavors, it’s money that makes the difference: not only does it help cover the costs of production, but it also frees up time to produce more. That’s why many of us resort to selling prints, crowdfunding and other ways of raising money.

One such way is Flattr. And it’s G-R-E-A-T.

The idea behind the Sweden-based company is both simple and ingenious: instead of actively sending money to an author, which can be complex and, in some cases, awkward, you prepay your Flattr account using pretty much any credit card (in addition to PayPal) and the system does the rest. All you have to do is “connect” your Flattr account to your social network accounts, which usually takes only a couple of clicks for each. This allows Flattr to track your likes and pay creators accordingly.

The only slightly more complicated one is Twitter, but it takes just a couple of clicks more: since the chirping network changed its terms and conditions, Flattr cannot directly track your favorites. The problem is easily worked around by using SuperFav: just connect it to both Flattr and Twitter, and you’re good to go.

Afterwards, when you favorite or like something on any of the connected social networks, that thing is said to be “flattr’d” by you and the author gets some money from your balance. You don’t have to do anything else, just top up your Flattr funds once in a while and then simply use your social networks as before. Neat, eh?

But it gets better. You can support as many artists as you like, and you don’t pay a cent more than what you want to. You can top up your Flattr funds as much as you want, and then set a monthly budget. At the end of the month, your monthly budget is equally divided between all the artists whose items you favorited or liked. You always know exactly how much you spend.

To make it even clearer: let’s say that you top up €15 and set your monthly budget to €5. During the first month, you “flattr” 5 authors by liking their content: each one gets €1. The next month you “flattr” 2 authors: each one gets €2.50. The next month you “flattr” 8 authors: each one gets €0.62. It doesn’t sound like much, but it adds up; and a little is better than nothing.
(Technically speaking there is a 10% fee that Flattr rightfully retains when paying credit out; but that’s of concern only to creators, not supporters.)
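For the mathematically inclined, here is a tiny sketch of how the split works. The function name and the way the fee is applied are my own illustration of the description above, not Flattr’s actual code or API:

```python
# Rough sketch of Flattr's monthly split as described above.
# The function and the fee handling are illustrative assumptions,
# not Flattr's real implementation.

def monthly_split(budget_eur, creators_flattrd, fee_rate=0.10):
    """Divide a monthly budget equally among the creators you flattr'd.

    Returns (gross amount per creator, net amount per creator after
    the ~10% fee Flattr retains when paying credit out).
    """
    if creators_flattrd == 0:
        return 0.0, 0.0  # nothing flattr'd this month, nothing is spent
    gross = budget_eur / creators_flattrd
    net = gross * (1 - fee_rate)
    return round(gross, 2), round(net, 2)

# The examples from the paragraph above, with a €5 monthly budget:
print(monthly_split(5, 5))  # (1.0, 0.9)
print(monthly_split(5, 2))  # (2.5, 2.25)
print(monthly_split(5, 8))  # (0.62, 0.56)
```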

Why sign up as a supporter, you ask? Because you like what authors make and feel that their productions are worth a few cents. It’s great to get thousands of views or dozens of favorites on a photo, or 110,000 views on a video. But if you like those things so much, why not take a step further and buy prints, buy books or, even more simply, Flattr? And of course, you can sign up as a contributor too, so you can both give and receive.

And while you’re at it, give it a try by using the “Flattr this” button right on this post.

Thanks!

Italian luddites: the downfall of a country living in the past

If you were to describe my country, Italy, as a country fearful of change, you wouldn’t be too far from the truth. If Italians could live inside a bubble that prevented time from passing, most of them would jump at the opportunity. I have come to the conclusion that most of my fellow countrymen are Luddites by nature.

Technology is seen as something to be feared, rather than embraced. Something new comes along, and people of all ages — including part of the youth — will complain that it’s unnecessarily complicated, that things worked just as fine before, and that “back then” nobody was forced to learn anything new. For a few years now I have wondered why people think this way, and I’ve come to the conclusion that it has to do with history.

Even today, a hundred and fifty-one years after the unification of the country, most Italians don’t really feel Italian. They are more likely to label themselves as coming from a certain region, city or even neighborhood. The North has been blabbering about independence for decades now, and the South is still stuck in the grip of organized crime, the mafia and its cousins sometimes being more popular and better regarded than the State. Indeed, the roots of these criminal organizations can be traced back to the bandits who fought against the forced “Northernization” of the peninsula — more specifically, the so-called Piemontesizzazione, as the first King of Italy simply exported the bulk of Piedmontese laws to the rest of the newborn country — immediately after the unification.

In a sense, that’s why Italians to this day consider “the State” to be inherently evil, something that should leave people alone instead of meddling with their lives. You seldom find someone who thinks that he or she, as a citizen, is part of “the State”. Rather, most people will complain about “the State” and, why not, rip it off if possible: after all, from their point of view it’s just reciprocation.

For this reason, each and every change is perceived as preposterous, demanded by the evil State for the sole purpose of complicating citizens’ lives, not unlike the way a big, seemingly almighty cat plays with a tiny mouse solely for its own amusement.

But it’s with technology that Italians show their chronic opposition to change most clearly. Most people over 50 have no clue whatsoever about computers. Unless they are introduced to them by some younger member of the family, or through some mandatory course at their workplace, most senior citizens remain completely oblivious to computers. Even among those who do use them, most remain antagonistic toward the machine.
Even more worrying is the fact that many young people are virtually as uninterested in computers as such, save for the fields in which they deem them useful: (illegal) file sharing, homework (and plagiarism), social networking, porn and the like. The interesting thing here is that these same young people spend most of their time with a smartphone in their hands, yet refuse to learn the basics of computing. I personally know an eighteen-year-old who claims that she never really learned how to use a computer because she never found a use for one.

Most of my foreign readers are probably shocked at this point, but see, the sad truth is that in Italy the internet is not necessary to carry on with your daily life. Nobody expects you to have an email address, or to submit documents online. I know doctors who proudly take note of their appointments on a dear old paper calendar, rather than using a computer, an iPad, a smartphone or even a measly electronic “data bank” from the 90s. They are completely oblivious to the capabilities that a digital system can provide — such as keeping an easily searchable long-term log of appointments, or cross-referencing notes — because they are not familiar with the possibilities, and even if they were, they wouldn’t want to spend/waste any time learning how to use the system.

In this country, most companies don’t even have a one-page website. Those that do seldom update it; it quickly turns into a stale flyer, but they don’t care. Who visits the website, anyway? After all, if clients want some information they’d better just call: writing to a company’s e-mail address almost invariably results in never receiving a reply, or in immediately receiving a notification that the recipient’s mailbox is full, a clear sign that it’s been left unchecked for the longest time.

When it comes to money, Italians’ fear of change goes into overdrive. Given the incredible level of corruption in the country, there have been feeble attempts at reducing the maximum amount that can be paid in cash, forcing any higher-value transaction to be carried out through means that leave a trail. Recently, this limit was lowered to a thousand euros. One would expect the strongest opposition to come from lobbying entrepreneurs, but no: the ones who complained the most were retired senior citizens. The new limit would prevent those among them whose pension is high enough (and their numbers are dwindling) from picking up the whole sum in cash in a single visit to the post office. Of course, having it deposited into a checking account would solve the problem immediately, but many people in Italy do not have a checking account at all, in part because Italian accounts carry the highest fees in all of Europe. Indeed, many people only open one when they are required to, such as when their employer insists on paying them by direct deposit, or when they want to buy a house and need a mortgage.
Credit card usage is also lower than in most of Europe, as many people simply don’t trust cards (or lack access to them, having no checking account). I know people who only use them at ATMs to withdraw cash, which — albeit useful in emergencies — is quite a silly thing: why not just use them directly to pay in stores?

When I read that Sweden is starting to consider the wholesale (pun intended) elimination of cash, as most Swedes use other means of payment and micropayments, I was stunned. That will never happen here. The people, the commoners if you will, would object too strongly, failing to see that it would actually lead to greater accountability and reduce much of the corruption. It would not make it entirely impossible to use money for bribes, of course, but it would require more careful planning than simply not issuing an invoice or handing out a receipt to clients. That alone would be an immense improvement, but then again, it requires a paradigm shift that most people are simply not willing to make, out of laziness rather than out of genuine concerns about privacy and tracking.

About a month ago, my region switched off all analog TV transmissions, finally entering the all-digital era. This was supposed to happen two years ago, but it kept being postponed over and over, in part due to the political agenda, and in part due to the fear that people would not be able to survive — metaphorically speaking, of course — the switch. It’s not hard: if you have a new TV, you’re already set; if not, you need a cheap converter box that you connect between the antenna and the TV. In some cases, as ironically happened to my very own household, you may need to call and pay a technician to replace and/or re-aim your antenna to improve reception. The government, years ago, even started a controversial campaign that allowed people to buy converter boxes at a discount, effectively semi-subsidizing the purchase of these devices.

Yet, even today, many people are incredibly confused about the whole matter, and the refrain is always the same: why does my grandma need to learn how to use a converter box with a different remote? Why does my grandpa have to spend money to get his antenna replaced? And mind you, these are the same people who complain that there’s nothing on TV. They may have to shell out some cash in some cases (though for most households the expense is simply the cost of the digital receiver, which retails for as little as €15), but they get many more free channels to watch in return. In most cases, moreover, the switch is so simple that any nephew or granddaughter can explain to their elders how to proceed.
The people who complain about how “the government did this to make us spend more money” (without realizing that the money spent, if any, goes to private companies, such as stores and antenna technicians) also fail to realize that the frequencies that get released will be auctioned off for mobile broadband, which will improve the availability of Internet access in areas currently not covered by DSL.

But, then again, who needs the Internet in Italy? The “Internet use in households and by individuals in 2011” report by Eurostat tells a fairly discouraging tale. A note for non-Europeans: “EU27” refers to the whole European Union, which includes 27 Member States (Austria, Belgium, Bulgaria, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden, United Kingdom) as opposed to “Eurozone”, which refers to the 17 Member States currently using the Euro as their currency.

Whereas 73% of the households in the EU27 had Internet access in 2011 and broadband availability was at 68%, only 62% of Italian households have Internet access and barely 52% have broadband. This is in stark contrast with other Western European countries such as France (76% and 70%, respectively), Belgium (77% and 74%), Germany (83% and 78%) or the Netherlands (94% and 83%.) What’s most amazing is that Greece jumped from 25% and 7% in 2007 to 50% and 45% in 2011, and Romania jumped from 22% and 8% to 47% and 31% during the same time span. Italy’s increase is still remarkable (43% and 25% to 62% and 52%), but we remain steadily behind the average.

It gets worse when the actual usage of the Internet, rather than its bare availability in households, is taken into account. An average of 71% of EU27 citizens used the Internet within the 3 months before the survey, 73% used the Internet within the 12 months before the survey, and 24% never used the Internet. The report doesn’t state whether this means never used it at all, or never used it within the past 12 months; in any case, this is only marginally relevant for the sake of the analysis.
In Italy, only 54% used the Internet within the last 3 months and 57% within the last 12 months, while 39% never did. Comparatively, in France these values are 78%, 80% and 19% respectively, in Germany they are 81%, 83% and 16%. Scandinavian countries lead the chart, with Sweden chiming in at 93%, 94% and 5%, and Norway at 93%, 94% and 5%. Iceland shows an even higher Internet penetration, but I’m concentrating on mainland Europe here.

The important fact here is the number of people who never used the Internet. Italy’s value is 39%, the highest in Western Europe after Greece (45%) and Portugal (41%), while the EU27 average is 24%. That’s more than one and a half times the EU average.
Moreover, only 51% of Italians access the Internet at least once a week and only 49% do so daily, while in Germany these values are 77% and 63% respectively. Unsurprisingly, 82% of Norwegian users access the Internet daily, and 91% do so weekly.

Italians are also not very keen on purchasing goods or services over the Internet. Compared to an EU27 average of 43% over the past 12 months, only 15% of Italians carried out economic transactions over the web. This is dramatically lower than France’s 53%, Germany’s 64%, the Netherlands’ 69% and Norway’s 73%.
The report doesn’t give the reasons for this poor showing, but I think I can elaborate a little on that. As I said in the first part of this article, Italians are somewhat afraid of change and are particularly opposed to payment methods other than cash. However, while you can enter a store and pay with notes and coins, you cannot do so over the Internet unless you choose cash-on-delivery, which is normally more expensive. This, together with an ancestral fear of fraud, lack of widespread Internet access — Italy had one of the strictest laws on public Wi-Fi, which simply killed the so-called “Internet cafés” —, generalized computer illiteracy, very high shipping costs and incredibly complicated bureaucracy, effectively hinders any widespread adoption of electronic commerce. This is not to say that e-shops cannot thrive in Italy; many of them do (and I have first-hand experience of this, because in 2008 and 2009 I worked in a small store that also sold its products online), but most of the buyers are usually returning customers. It’s hard to make a company grow in such an environment, and online businesses shut down daily.

All of this unfortunately triggers a chain reaction: since few people use the Internet, few people will buy online; since few people buy online, few companies will be eager to do business online (and few public authorities will invest in letting citizens deal with them over the web, given the investment required and the current state of the economy).

In the EU27, 41% of people interacted with public authorities over the Internet in the last 12 months, but only 22% did so in Italy. The pattern repeats again: France chimes in at 57%, the Netherlands at 62% and Norway at 74%.
Italy’s percentage is only about half of the average, and that’s frankly not surprising. Our bureaucracy is so heavy and complex that, even if new material were handled digitally, old certificates will probably never be transposed to the 21st century.

Again, I can provide first-hand experience: my parents live in Chieti but they married in my mother’s town, Vasto, which is located about 75 kilometers away. They need a marriage certificate, and the only way to get it is to go to the city hall in Vasto and request it there. There is simply no way to request it at the local city hall and have it sent over via fax or something like that, let alone to obtain it directly online. Moreover, since it’s a semi-private act, the request cannot be delegated to some relative who lives there, so they have to be there in person. The most ironic part is that not only will this take the better part of a day, plus money for gas and highway tolls, but the certificate itself will not even be free. Then again, since very few people would request this kind of document online, there is no reason for public authorities to invest in a massive digital upgrade.

This whole chain reaction leads to an unpleasant conclusion: one of the reasons for Italy’s economic downfall is the country’s inability to change and modernize by embracing technology. What’s even sadder is seeing hordes of youths, the same youths who fiddle with their parent-funded smartphones all day long, puzzled in front of a computer screen. How can we expect things to improve if our future doctors, lawyers and entrepreneurs are confused by paragraph styles in word processors?

God is an atheist

God here.

First, I do not exist. The concept of a 13,700,000,000 year old being, capable of creating the entire Universe and its billions of galaxies, monitoring simultaneously the thoughts and actions of the 7 billion human beings on this planet is ludicrous. Grow a brain.

Second, if I did, I would have left you a book a little more consistent, timeless and independently verifiable than the collection of Iron Age Middle Eastern mythology you call the Bible. Hell, I bet you cannot tell me one thing about any of its authors, their credibility or their possible ulterior motives, yet you cite them for the most extraordinary of claims.

Thirdly, when I sent my “son” (whatever that means, given that I am god and do not mate) to Earth, he would have visited the Chinese, Japanese, Europeans, Russians, sub-Saharan Africans, Australian Aboriginals, Mongolians, Polynesians, Micronesians, Indonesians and native Americans, not just a few Jews. He would also have exhibited a knowledge of something outside of the Iron Age Middle East.

Fourthly, I would not spend my time hiding, refusing to give any tangible evidence of my existence, and then punish those who are smart enough to draw the natural conclusion that I do not exist by burning them forever. That would make no sense to me, given that I am the one who withheld evidence of my existence in the first place.

Fifth, I would not care who you do or how you “do it”. I really wouldn’t. This would be of no interest to me, given that I can create Universes. Oh, the egos.

Sixth, I would have smited all evangelicals and fundamentalists long before this. You people drive me nuts. You are so small minded and yet you speak with such false authority. Many of you still believe in the talking snake nonsense from Genesis. I would kill all of you for that alone and burn you for an afternoon (burning forever is way too barbaric for me to even contemplate).

Seventh, the whole idea of members of one species on one planet surviving their own physical deaths to “be with me” is utter, mind-numbing nonsense. Grow up. You will die. Get over it. I did. Hell, at least you had a life. I never even existed in the first place.

Eighth, I do not read your minds, or “hear your prayers” as you euphemistically call it. There are 7 billion of you. Even if only 10% prayed once a day, that is 700,000,000 prayers. This works out at 8,000 prayers a second – every second of every day. Meanwhile I have to process the 100,000 of you who die every day between heaven and hell. Dwell on the sheer absurdity of that for a moment.

Finally, the only reason you even consider believing in me is because of where you were born. Had you been born in India, you would likely believe in the Hindu gods, if born in Tibet, you would be a Buddhist. Every culture that has ever existed has had its own god(s) and they always seem to favor that particular culture, its hopes, dreams and prejudices. What, do you think we all exist? If not, why only yours?

Look, let’s be honest with ourselves. There is no god. Believing in me was fine when you thought the World was young, flat and simple. Now we know how enormous, old and complex the Universe is.

Move on – get over me. I did.

God

(I didn’t write this; I found it on the web, but I wholeheartedly agree.)

In memoriam: Steve Jobs, 1955-2011

Steve is gone. I waited a few days before writing about it, yet it still feels unreal. The man who co-founded Apple, the company that essentially created modern computing and more, is gone.

It was widely known that he was sick, and those who follow Apple-related news and rumors had a feeling that it was only a matter of time when he stepped down as Apple CEO in late August. He had said that, should his health no longer allow him to keep that position, he would step down; and a month and a half ago, he did. We knew it was coming, there’s no denying that. The forum posts at the time were rife with surreal optimism, as if we were all thinking the same thing but refused to say it out loud: he may not be CEO, but he’s still on the board, so he will still direct the company’s future. Michael Grothaus of TUAW posted about his first-hand experience with Tim Cook years earlier and stated that he would be a good CEO, though of course a different one from Steve; that was proven during the presentation of the iPhone 4S, just two days before Steve’s departure. In hindsight, the people on stage certainly knew about Steve’s situation, and that explains the lack of enthusiasm. That was nothing, however, compared to what came later.

Indeed, the news of Steve’s death echoed through the world with unexpected force. I learned about it in the early morning, Central European Time, of October 6th. As usual, I had woken up and grabbed my iPhone 4 while still in bed, to check my mail and the news. It was on the homepage of Repubblica, an Italian newspaper; I figured it was a mistake, or perhaps I had misunderstood. I checked the online editions of other Italian newspapers: it was on their homepages too. I checked other international sources, and finally landed on MacRumors. I won’t deny that it hit me like a freight train at full throttle. Steve Jobs, dead. I imagined him peacefully resting on his bed, with his family around him, his John Lennon-style rimless round glasses on his bedside table. He was a Buddhist, so I had to refrain from thinking of him arriving at the gates of heaven and suggesting to god to use an app to sort souls out more efficiently. Again, it hit me: there would be no more “Stevenotes,” the nickname given to his keynotes. No more “reality distortion fields.” No more “one more thing.”

Worldwide press wrote about his life and his death. He has been described as a visionary, a man ahead of his time. His 2005 Stanford Commencement address has been widely referenced, because it shows what Steve was all about. He was a great speaker. He was able to share dreams.

A few voices rose against this media frenzy. He was just a skilled salesman, some said. He should not be hailed as a god on earth, because the company he founded is just another multinational capitalist group that abuses poor workers in remote areas of the world. These people forget that Apple last year started paying direct subsidies to Foxconn employees, and that Foxconn also manufactures products for other companies, such as Dell and HP. Moreover, there is nothing inherently wrong with being a skilled salesman.

Steve, however, was more than a salesman. He truly lived the American dream (his personal CV on mac.com, many years ago, stated something like “founded Apple in my garage; sold my VW minivan”) and his company literally anticipated the times. Its concept of a Knowledge Navigator, demoed in this video made in 1987, is stunningly evocative of what the iPad would become in 2010. And 1987 was three years before the very first text-only web browser appeared; indeed, the world wide web only saw the light of day in late 1990. This is what Steve’s vision was all about. He did not think of products: he came up with concepts, ideas, plans. The products Apple makes are merely tools to enable people to do what they want to do as efficiently as possible. In a 1998 interview with BusinessWeek, he stated: “A lot of times, people don’t know what they want until you show it to them.” Apple’s success story has proven him right many times, even when it looked like the company was swimming against the tide. The rumors of Apple going bankrupt – something that was indeed almost bound to happen during Steve’s absence from the company in the early 1990s – never ceased until recent times. Many analysts claimed the iMac was doomed to failure because it had no serial or parallel ports, only those new USB ports that meant virtually no existing device could be connected to it. A few years later, it was difficult to find peripherals without a USB connector. The same computer dropped the diskette drive. Apple built what people needed before those same people even thought about it, quite like Henry Ford, who claimed: “If I had asked people what they wanted, they would have said faster horses.”

I don’t know what will happen to Apple now. As an avid Apple user since 2001, and as an iOS developer (my app, Quick Whois, is available on the App Store), I am a little bit concerned. I switched to the Mac before Apple launched its “Switch” campaign, and I haven’t looked back once. That is not to say that I despise other systems: like any other craftsman, I realize that each job requires the right tool. I prefer to work on my Mac, however, and I am naturally interested in knowing whether it will keep innovating or not.

Some sources state that Steve left four years’ worth of plans for the company. As great as I think Steve was, I doubt that’s the case. Certainly he trusted his closest colleagues and shared his vision with them, and I wouldn’t be surprised if, ten years from now, I will be posting on a blog – or whatever else we will post on then – that once again, Mr. Jobs had gotten it right. However, it is hard to believe that he left plans for specific future products. While Apple has indeed been ahead of the times and renewed the industry countless times, the technology Apple uses also depends on other manufacturers. Yes, perhaps next year Apple will introduce a MacBook Air based on an ARM CPU, but that’s about as far as his “plans” could have gone. We don’t know, for instance, how long (and indeed, whether) copper-based Thunderbolt will be popular enough for people to consider switching to fiber optics. Still, I won’t deny that I find it somewhat amusing to imagine Steve as a real-world Hari Seldon, laying down future plans for humanity thanks to his studies in psychohistory.

In any event, I think that there truly was no better tribute to Steve’s influence than having tens, or indeed hundreds, of millions of people learn about his departure through a device he envisioned and blessed, whether it was a Mac, an iPhone, an iPod Touch or an iPad. Or, why not, a Newton.

And secretly, quietly, I, like many others, weak-heartedly keep hoping that this is just a stunt, yet another example of the Reality Distortion Field he created, and that he will be back on stage for an encore: “Oh, and one more thing…”

Steve Jobs 1955-2011

Here’s to the crazy ones.
  The misfits.
    The rebels.
      The troublemakers.
        The round pegs in the square holes.

The ones who see things differently.
They’re not fond of rules.
    And they have no respect for the status quo.

You can praise them, disagree with them, quote them,
    disbelieve them, glorify or vilify them.
About the only thing you can’t do is ignore them.
    Because they change things.

They invent.   They imagine.   They heal.
  They explore.   They create.   They inspire.
    They push the human race forward.

Maybe they have to be crazy.

How else can you stare at an empty canvas and see a work of art?
Or sit in silence and hear a song that’s never been written?
Or gaze at a red planet and see a laboratory on wheels?

We make tools for these kinds of people.

While some see them as the crazy ones,
    we see genius.

Because the people who are crazy enough to think
they can change the world, are the ones who do.

Photographic trends I just don’t understand

It is no mystery that I have a passion for photography. Having published two books, posting regularly to my Flickr stream, and knowing the theory of optics rather than just snapping around, I think I know what I’m doing. Mind you, this does not mean I consider myself an artist. It may sound cliché, but I am strongly convinced that “artist” is a label that others should bestow upon you, rather than something you call yourself. In fact, despite what I am often told, I do not feel that my photography is that good. It’s not false modesty: I really don’t think so.

However, ever since the introduction of cheap compact cameras (and, god forbid, cheap reflex cameras), photography has become mainstream. There is nothing inherently wrong with that – the more the merrier, right? – yet there are some trends in photography that I simply do not understand, and some that are just plain bad. Needless to say, these annoyances are most often perpetrated by hipsters or (gasp!) wannabe hipsters. Now, it has to be clarified that my concept of hipster includes not just the traditional, American-style hipster, but more generally all those “subcultures” – trust me, quotes were never more appropriate – that strive to be alternative and ultimately fail to be unique. This includes, admittedly due to my cultural vantage point, the decadent leitmotif that seems to permeate the lives of Italian teenage girls and young women. I may write about this matter specifically at some point, as it’s not limited to photography.

So, without further ado, let me present a roundup of the most annoying trends in photography today. It goes without saying that this is merely my personal opinion.

Continue reading “Photographic trends I just don’t understand”

Languages: ambiguous parsing

There is one reason computers are great at numbers and awful at languages: the latter are difficult to parse. While complex mathematical operations can be carried out in a well-known order, parsing text can be excruciatingly difficult even for humans.

This is especially true for languages — such as English — that allow long sequences of words to be joined together without prepositions, and that use the same word both as a noun and as a verb.
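To give an idea of how quickly the noun/verb problem piles up, here is a toy sketch that simply enumerates the part-of-speech readings a tiny hand-made lexicon allows for the classic sentence “Time flies like an arrow”. The mini-lexicon is invented purely for illustration; real parsers face the same combinatorial explosion on a vastly larger scale.

```python
# Toy illustration of lexical ambiguity, one reason parsing natural
# language is hard. The mini-lexicon below is invented for this example.

from itertools import product

LEXICON = {
    "time":  ["noun", "verb"],        # "the time" vs. "time the race"
    "flies": ["noun", "verb"],        # "fruit flies" vs. "time flies"
    "like":  ["verb", "preposition"], # "flies like honey" vs. "like an arrow"
    "an":    ["determiner"],
    "arrow": ["noun"],
}

def pos_readings(sentence):
    """Enumerate every part-of-speech assignment the lexicon allows."""
    words = sentence.lower().split()
    options = [LEXICON.get(word, ["unknown"]) for word in words]
    return [list(zip(words, combo)) for combo in product(*options)]

readings = pos_readings("Time flies like an arrow")
print(len(readings))  # 8 candidate tag sequences for a five-word sentence
for reading in readings[:2]:
    print(reading)
```

Only a couple of those eight readings correspond to sensible English, and picking them out requires the kind of context and world knowledge that humans apply effortlessly and machines do not.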

Continue reading “Languages: ambiguous parsing”

Languages: the strange case of Pirahã and Aymara

In my last post, I wrote about the connections between language and thought, i.e. linguistic relativity / determinism.

In today’s highly globalized world, languages get mixed and evolve at a much faster pace than ever before. English, for instance, is no longer only divided into British, American, Canadian and Australian English; we could say that there is a variety or dialect of English for nearly every other natural language: Spanglish, Chinglish and so on. When French was the de-facto lingua franca of diplomacy (and, by extension, Western Europe), it was not substantially modified by other local languages; yet when English replaced it, after World War I and especially after World War II, it started changing immediately.

English, in particular its American variety, did not spread only through international diplomacy; rather, as the United States rose to superpower status in many fields (technology, business, etc.), one could argue that its language spread from the bottom up. The average Joe in most other Western countries was exposed to American words: they wore blue jeans, they put coins into juke-boxes, they went to a bar. English words became commonplace over time, and this ultimately led to the creation of what could easily be considered a series of creoles that are, for the most part, mutually intelligible.

Continue reading “Languages: the strange case of Pirahã and Aymara”

Languages: linguistic relativity, words vs. thought

One of the most intriguing concepts in linguistics is the so-called Sapir-Whorf hypothesis, or linguistic relativity principle. Simply put, it states that the language we speak can influence the way we think. Another common name for this theory is linguistic determinism. There are some subtleties in the usage of these different names (no pun intended), but in order to avoid confusing them and giving wrong information, I’ll refrain from trying to distinguish them here. There are many resources online for those who wish to delve deeper into the details. For the sake of this post, I will use the terms interchangeably.

Anybody who has studied a foreign language, even without reaching fluency, has most likely had an experience with the linguistic relativity principle. The more the language in question differs from one’s native language, the more obvious the phenomenon becomes.

Continue reading “Languages: linguistic relativity, words vs. thought”