In 2009 Stuart McMillen, the famed Australian comic artist, published a drawn rendition of a short passage from Neil Postman’s Amusing Ourselves to Death. The passage compares the radically different worlds depicted by Orwell in his “1984” and by Aldous Huxley in his “Brave New World.” Both novels show an Earth whose inhabitants have been rendered helpless and brainwashed, and are considered the quintessential dystopian novels. The term Big Brother, after all, was coined by Orwell for his novel. Yet they depict radically different approaches to enslaving humankind. I’ll leave you to the words of Postman and to the wonderful, if not a little spine-chilling, imagery of McMillen. What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who would want to read one. Orwell feared those who would deprive us of…
Many social networks today employ the concept of “favoriting” items: images on Flickr, Instagram and 500px; songs on SoundCloud; videos on YouTube and Vimeo; tweets on Twitter; repositories on GitHub; and so on… When you “favorite” or “like” something, you’re essentially telling the author that you’d like more of that. Yet, when it comes to creative endeavors, it’s money that makes the difference: not only does it help cover the costs of production, but it also frees up time to produce more. That’s why many of us resort to selling prints, crowdfunding and other ways of raising money. One such way is Flattr. And it’s G-R-E-A-T. The idea behind the Sweden-based company is both simple and genius: instead of actively sending money to an author, which can be complex and, in some cases, awkward, you can prepay your Flattr account using pretty much any credit card (in addition to PayPal)…
If you were to describe my country, Italy, as a country fearful of change, you wouldn’t be too far from the truth. If Italians could live under a bubble preventing time from passing, most of them would jump at the opportunity. I have come to the conclusion that most of my fellow countrymen are Luddites by nature. Technology is seen as something to be feared, rather than embraced. Something new comes along, and people of all ages — including part of the youth — will complain that it’s unnecessarily complicated, that things worked just fine before, and that “back then” nobody was forced to learn anything new. For a few years now I have wondered why people think this way, and I believe it has to do with history. Even today, a hundred and fifty-one years after the unification of the country,…
God here. First, I do not exist. The concept of a 13,700,000,000-year-old being, capable of creating the entire Universe and its billions of galaxies, simultaneously monitoring the thoughts and actions of the 7 billion human beings on this planet, is ludicrous. Grow a brain. Second, if I did, I would have left you a book a little more consistent, timeless and independently verifiable than the collection of Iron Age Middle Eastern mythology you call the Bible. Hell, I bet you cannot tell me one thing about any of its authors, their credibility or their possible ulterior motives, yet you cite them for the most extraordinary of claims. Third, when I sent my “son” (whatever that means, given that I am god and do not mate) to Earth, he would have visited the Chinese, Japanese, Europeans, Russians, sub-Saharan Africans, Australian Aboriginals, Mongolians, Polynesians, Micronesians, Indonesians and Native Americans, not…
Steve is gone. I waited a few days before writing about it, yet it still feels unreal. The man who created Apple, the company that essentially created modern computing and more, is gone. It was widely known that he was sick, and those who follow Apple-related news and rumors had a feeling that it was only a matter of time when he stepped down as Apple CEO in late August. He had said that, should his health not allow him to keep that position any longer, he would step down; and a month and a half ago, he did. We knew it was coming, there’s no denying that. The forum posts at the time were rife with surreal optimism, as if we were all thinking the same thing but refused to say it out loud: he may not be CEO, but he’s still on the board, so he will still…
It is no mystery that I have a passion for photography. Having published two books, posting regularly to my Flickr stream, and knowing the theory of optics in addition to just snapping around, I think I know what I’m doing. Mind you, this does not mean I consider myself an artist. It may sound cliché, but I am strongly convinced that “artist” is a definition that others should cast upon you, rather than something you call yourself. In fact, despite what I am often told, I do not feel like my photography is that good. It’s not false modesty: I really don’t think so.
However, ever since the introduction of cheap compact cameras (and, god forbid, cheap reflex cameras), photography has become mainstream. There is nothing inherently wrong with that – the more the merrier, right? – yet there are some trends in photography that I simply do not understand, and some that are just plain bad. Needless to say, these annoyances are most often perpetrated by hipsters or (gasp!) wannabe hipsters. Now, it has to be clarified that my concept of hipster includes not just the traditional, American-ish hipster, but more generally all those “subcultures” – trust me, quotes were never more appropriate – that strive to be alternative and ultimately fail to be unique. This includes, admittedly due to my cultural vantage point, the decadent leitmotif that seems to permeate the lives of Italian teenage girls and young women. I may write specifically about this matter, as it’s not specific to photography.
So, without further ado, let me present a roundup of the most annoying trends in photography today. It goes without saying that this is merely my personal opinion.
There is one reason computers are great at numbers and awful at languages: the latter are difficult to parse. While complex mathematical operations can be carried out in a well-known order, parsing text can be excruciatingly difficult even for humans.
This is especially true for languages — such as English — that allow long sequences of words to be joined together without prepositions, and that use the same word both as a noun and as a verb.
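To make that ambiguity concrete, here is a minimal sketch in Python using NLTK — the tool is entirely my own choice for illustration, not something the post prescribes. A part-of-speech tagger has to decide from context alone whether a word such as “flies” is a noun or a verb.

```python
# Minimal sketch (assumes NLTK is installed; the data packages below
# are downloaded on first run).
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "Time flies like an arrow"
tokens = nltk.word_tokenize(sentence)
print(nltk.pos_tag(tokens))
# "Time", "flies" and "like" can each be read more than one way
# (noun, verb, preposition), so the tagger must guess from context.
```

The classic “Time flies like an arrow; fruit flies like a banana” pair shows why even a correct tagging of the first clause tells you nothing certain about the second.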
In my last post, I wrote about the connections between language and thought, i.e. linguistic relativity/determinism.
In today’s highly globalized world, languages get mixed and evolve at a much faster pace than ever before. English, for instance, is no longer divided only into British, American, Canadian and Australian English; we could say that there is a variety or dialect of English for nearly every other natural language: Spanglish, Chinglish and so on. When French was the de facto lingua franca of diplomacy (and, by extension, of Western Europe), it was not substantially modified by other local languages; yet when English replaced it, after World War I and especially after World War II, it started changing immediately.
English, in particular its American variety, did not spread only through international diplomacy; rather, as the United States rose to superpower status in many fields (technology, business, etc.), one could argue that its language spread from the bottom up. The average Joe in most other Western countries was exposed to American words: they wore blue jeans, they put coins into juke-boxes, they went to a bar. English words became commonplace over time, and this ultimately led to the creation of what could easily be considered a series of creoles that are, for the most part, mutually intelligible.
One of the most intriguing concepts in linguistics is the so-called Sapir-Whorf hypothesis, or linguistic relativity principle. Simply put, it states that the language we speak can influence the way we think. Another common name for this theory is linguistic determinism. There are some subtleties in the usage of these different names (no pun intended), but in order to avoid confusing them and giving wrong information, I’ll refrain from attempting to untangle them here. There are many resources online about the details of this topic for those who wish to delve deeper. For the sake of this post, I will freely use the terms interchangeably.
Anybody who has studied a foreign language, even without reaching fluency, has most likely had an experience with the linguistic relativity principle. The more the language in question differs from one’s native language, the more obvious the phenomenon becomes.
This is spot on. You’ll need this if you ever come here. 😀