Thursday, April 28, 2016
You Are Quoting Shakespeare ↦
Rob Brydon demonstrates just how many common (and some less common) phrases in the popular vernacular come from Shakespeare:
Monday, April 25, 2016
Intel, the world’s largest computer chip maker, is still doing alright. In its most recent earnings announcement, for the first quarter of 2016, the company announced profits of $2 billion. Unfortunately, the results were paired with a quite different announcement: that Intel would be laying off up to 12,000 employees in order “to accelerate its evolution from a PC company to one that powers the cloud and billions of smart, connected computing devices.”
Hidden in that language is an admission that Intel doesn’t already dominate the market of “smart, connected computing devices” the way it does dominate the PC CPU market. Timothy Lee of Vox highlights one bet that Intel made about 10 years ago that explains why:
Intel turned down an opportunity to provide the processor for the iPhone, believing that Apple was unlikely to sell enough of them to justify the development costs.
Oops.
Really, though, that bet was just the logical outcome of an even earlier bet that has cost Intel dearly in the era of smartphones. Intel’s mobile chip line—Atom—hasn’t gained much traction in the smartphone market where it faces stiff competition from ARM processors. ARM has been designing low-power processors for decades and licensing its designs to chipmakers. In fact, Intel owned an ARM chip making business called XScale but sold it in 2006:
Intel sold XScale because it wanted to double down on the x86 architecture that had made it so successful. Intel was working on a low-power version of x86 chips called Atom, and it believed that selling ARM chips would signal a lack of commitment to the Atom platform.
It’s important to note that Intel didn’t fail to see a mobile revolution coming. They just bet that the revolution would be powered by x86 chips. But x86’s unassailable dominance in the PC market hasn’t helped it in the mobile market, because smartphones came with entirely new platforms (iOS and Android) that made it easy to choose ARM over x86. Intel decided to put all of its eggs in the x86 basket instead of continuing to hedge its bets with XScale/ARM. Now it is paying the price for that decision.
Monday, April 25, 2016
In an article published in The Psychological Bulletin, psychologists A. Kluger and A. Denisi reported a meta-analysis of 607 studies of performance evaluations and concluded that at least 30 percent of performance reviews resulted in decreased employee performance.
Yup.
Thursday, April 21, 2016
Brent Simmons weighs in on Swift’s performance as a programming language:
Maybe Swift is faster than Objective-C, or will be. But that also hardly matters.
[…]
Instead of language performance the focus should be on developer productivity. Developer performance. That’s the thing that counts these days.
Of course, that’s the argument that proponents of interpreted languages have been using for some time now.
But I think Swift otherwise gives and takes away — we get type inference and all the lovely things I mentioned before, but we end up fighting the type system, standing on our heads to deal with optionals, and working with a language that’s much larger (demonstrably) than is needed for writing great apps. It solves a whole bunch of problems that didn’t need solving (for app-writing).
I tend to agree. It seems there’s been a trend lately—embodied by Swift, Rust, and Go—towards languages whose type system is strong, static, and inferred (where possible). The implicit promise is that you can have the best of both worlds: the “safety” of strong, static typing combined with the simplicity of not always having to declare types. But strong, static typing is still what it has always been: fiddly. It significantly reduces flexibility for the sake of catching certain classes of programming errors at compile time¹.
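A small sketch may make the trade-off concrete. Swift isn’t shown here; TypeScript (with strict null checks) is used purely as a stand-in, since its inferred static types and nullable unions behave much like Swift’s type inference and optionals. Everything in the snippet is invented for illustration.

```typescript
// TypeScript as a stand-in for the Swift trade-offs discussed above:
// inferred static types plus explicit handling of "no value".

// Type inference: no annotation needed, but `count` is still statically a number.
let count = 42;
// count = "forty-two"; // compile-time error: string is not assignable to number

// Optional-style unwrapping: under strict null checks, a nullable value
// must be handled before use, much like a Swift optional.
function firstWord(text: string | null): string {
  if (text === null) {
    return ""; // the "standing on our heads" step: nil handled explicitly
  }
  return text.split(" ")[0];
}

console.log(firstWord("hello world")); // → "hello"
console.log(firstWord(null));          // → ""
```

The compiler catches the misassignment and the unhandled null at compile time—exactly the class of errors this style of typing trades flexibility for.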
Marco Arment adds his thoughts:
I don’t know much Swift yet. But I’ve felt since its introduction that while it seems like a good language overall, it feels more like a language designed by C++ enthusiasts to replace C++, rather than being particularly optimized for 99% of what it’ll really be used for: making high-level mobile and PC apps.
[…]
The idea of one language to serve all roles, high-level to low-level, is an interesting thought challenge, but I don’t think it could exist.
¹ It’s worth noting that not all of the “errors” that strong, static typing catches at compile time are actually errors. Swift’s optionals represent an implicit admission of this by allowing developers to use nil in variables that would otherwise require a value of a different type. But there are plenty of other cases where type may not matter as much. Many math operations, for example, should be just as applicable to floating point numbers as they are to integers. ↩
Wednesday, April 20, 2016
The British government’s main electronic security advisory group, the CESG, published some password guidelines late last year. One guideline that was, perhaps, a little surprising is that they now recommend against requiring users to regularly change their passwords:
It’s one of those counter-intuitive security scenarios; the more often users are forced to change passwords, the greater the overall vulnerability to attack. What appeared to be a perfectly sensible, long-established piece of advice doesn’t, it turns out, stand up to a rigorous, whole-system analysis.
I could not possibly agree more. Requiring frequent password changes is the software equivalent of taking your shoes off at the airport: useless security theater, giving the appearance of increasing security while actually accomplishing the opposite.
Wednesday, April 20, 2016
HTML5’s <video> element has long been mired in controversy. The specification is silent as to which video codecs browsers should support for use with that element. Apple championed the use of the H.264 codec, as many of their mobile devices, including iPods and iPhones, have long had hardware decoders for that codec. Consequently, Apple’s Safari and Mobile Safari browsers support H.264 exclusively. But H.264 is controlled by the MPEG Licensing Authority and there were concerns that the MPEG LA could start charging exorbitant licensing fees once the codec’s use was firmly established.
In 2010, Google acquired a company called On2 and with it the VP series of codecs they had developed. Google then announced that the VP codecs would be available royalty-free in perpetuity. Subsequently, both Google’s Chrome and Mozilla’s Firefox implemented support for the VP8 codec (and its successor, VP9).
Also in 2010, Microsoft announced that Internet Explorer would only support the H.264 codec. For the last six years, then, the landscape for the <video> element has been bifurcated, with Safari and Internet Explorer (and its successor, Edge) supporting H.264 while Chrome and Firefox supported VP8/VP9.
Now, Microsoft has reversed their decision and has announced that they will support VP9 as well:
Starting with EdgeHTML 14.14291, the open-source WebM container format and the VP9 video and Opus audio codecs are supported in Microsoft Edge. These are available to websites that use Media Source Extensions (MSE) to adaptively stream video content. Windows Web Apps (built on the same Edge APIs) will also be able to use WebM streams containing VP9 and Opus. This change will be available in stable releases starting with the Windows 10 Anniversary Update.
Of course, mobile devices without a hardware decoder for VP9 will experience higher CPU usage and lower battery life when viewing VP9 content, so:
…we’ve put VP9 behind an experimental flag in Microsoft Edge, and have provided a default setting for it that automatically enables VP9 when hardware acceleration is detected. VP9 is not supported on Windows mobile SKUs at this time.
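For sites doing their own adaptive streaming, the practical question is whether the browser’s MSE implementation accepts VP9 at all. A hedged sketch of that feature check follows; the `globalThis` lookup is only there to keep the snippet runnable outside a browser, where `MediaSource` is not defined. In a page, `MediaSource` is the standard MSE global.

```typescript
// Feature-detecting VP9 support before choosing a WebM/VP9 stream.
// `MediaSource.isTypeSupported` is the standard MSE capability check.
const ms = (globalThis as any).MediaSource;

function supportsVp9(): boolean {
  if (typeof ms === "undefined") return false; // no MSE at all (or not a browser)
  return ms.isTypeSupported('video/webm; codecs="vp9"');
}

console.log(supportsVp9());
```

A player that gets `false` here would fall back to an H.264 stream, which is why the Edge change matters: one fewer browser stuck on the fallback path.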
Friday, April 8, 2016
Tweet from Jochen Wolters:
Myth: Open offices result in massive collaboration.
Reality: 2 people loudly collaborate; 30 must wear headphones to get any work done.
Bingo.
Friday, April 8, 2016
Robin Rendle defends the use of webfonts against complaints that they are too slow, among other things:
All of these points lead to Adam’s vitriolic condemnation of web fonts as being lazy, useless things; they’re not worth the effort to implement or stress over because they have no value whatsoever. On this point I heartily disagree.
Although Adam does make some good points towards the end:
“System fonts can be beautiful. Webfonts are not a requirement for great typography.” I entirely agree with this sentiment and, in certain circumstances, system or “web-safe” fonts should be used instead. When we download a typeface that is almost identical to Georgia or Helvetica then there’s not much of an advantage that can be had from requesting a large font file.
With that said, I don’t believe that all of human experience can be elegantly communicated via Helvetica, Times, Georgia, or San Francisco. And when I read that “typography is not about aesthetics” then I sigh deeply, heavily and come to the conclusion that 1. yes it is and 2. aesthetics is a problem for the reader. The more ugliness that is pressed upon us, the more lazy we become. Beauty, legibility, subtlety, these are the qualities that are possible with the help of web fonts and without them we are left with a dismal landscape devoid of visual grace or wit.
I agree heartily with these sentiments, as you may surmise from my own use of webfonts. Oscar Wilde once wrote, “All art is quite useless.” It’s true; art is not useful in the same sense that a hammer is useful. But then, as Stephen Fry wrote in reference to Wilde, “It is the useless things that make life worth living[.]”
The concern about web page load times is a genuine one, and one that too few web developers seem to care about. But it’s too easy to blame it all on webfonts; that’s mere scapegoating. The download size of a web site shouldn’t be thought of in terms of specific resources like images and webfonts but rather in terms of a download size budget. Once a total budget has been settled on, all resources—images, webfonts, CSS, HTML, JavaScript, and whatever else—should be made to fit in that budget. Sometimes that requires tradeoffs, and sometimes webfonts may be sacrificed. But other resources can be sacrificed as well, such as images or even, dare I say, that bloated JavaScript framework that everybody’s excited about but takes forever to download and execute.
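The budget idea is easy to make concrete. Here is a toy sketch—every name and number is made up for illustration—of checking a page’s resources against a total weight budget:

```typescript
// Hypothetical page-weight budget check; all figures below are invented.
interface Resource { name: string; bytes: number; }

const budgetBytes = 500 * 1024; // a 500 KB total budget, settled on up front

const resources: Resource[] = [
  { name: "HTML",       bytes:  20 * 1024 },
  { name: "CSS",        bytes:  40 * 1024 },
  { name: "JavaScript", bytes: 250 * 1024 },
  { name: "Images",     bytes: 150 * 1024 },
  { name: "Webfonts",   bytes:  80 * 1024 },
];

const total = resources.reduce((sum, r) => sum + r.bytes, 0);

if (total > budgetBytes) {
  // Over budget: something has to give, but it needn't be the webfonts.
  console.log(`Over budget by ${(total - budgetBytes) / 1024} KB`); // → 40 KB over
}
```

In this made-up page, trimming the JavaScript entry by 40 KB brings the total under budget without touching the webfonts—which is exactly the kind of tradeoff the budget framing makes visible.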
Thursday, March 31, 2016
It seems that Microsoft’s new-found love for Linux is even deeper than we thought:
We’re still trying to get the inside story on what Microsoft has done here, but what we’ve known for several months now is that the company has developed some Windows kernel components (lxcore.sys, lxss.sys, presumably standing for “Linux core” and “Linux subsystem,” respectively) that support the major Linux kernel APIs. These components are not GPLed and do not appear to contain Linux code themselves; instead, they implement the Linux kernel API using the native Windows NT API that the Windows kernel provides. Microsoft is calling this the “Windows Subsystem for Linux” (WSL).
A kernel API is one thing, but to be useful you need user mode applications. […] For WSL, however, Microsoft is turning to Canonical, creators of Ubuntu, for help. Canonical has provided a system image containing the Ubuntu versions of the various command-line tools that are typically found in a Linux distribution.
Our understanding is that these are not recompiled or ported versions of the programs (as are used in tools aiming to provide a Unix-like environment on Windows such as Cygwin) but instead unmodified programs. Microsoft is describing this in terms of providing a Linux-like command-line environment at the moment, but from what we can gather, there’s little fundamental restriction to this, potentially opening the door to running a wide range of Linux programs natively on Windows.
There is wide consensus in my Twitter feed that the best response to this announcement is, “It’s the year of Linux on the desktop!”
Wednesday, March 30, 2016
In stark contrast to the Roman and Greek cultures, there is another ancient Mediterranean culture about which we know comparatively little: the Etruscans. The Romans called them Tusci or sometimes Etrusci and it is from those Roman words that the region of Tuscany—which was the heartland of the Etruscans—derives its name.
One of the primary reasons that we know relatively little about the Etruscans is that very little of their language has survived. Most Etruscan writing appears to have been on perishable materials such as cloth books or wax tablets, which have not survived into the modern era. Complicating matters further is the fact that the Etruscan language, unlike its neighbors Latin and Greek, may not have been part of the Indo-European family of languages.
All of this makes a recent archaeological discovery in Italy all the more significant: a large stone, inscribed with Etruscan writing, has been discovered.
At a dig outside Florence, a group of researchers have unearthed a massive stone tablet, known as a stele, covered in Etruscan writing. The 500-pound stone is 4 feet high and was once part of a sacred temple display. But 2500 years ago it was torn down and used as a foundation stone in a much larger temple. Hidden away for thousands of years, the sandstone stab[sic] has been preserved remarkably well. Though it’s chipped, and possibly burned on one side, the stele contains 70 legible letters and punctuation marks. That makes it one of the longest examples of Etruscan writing known in the modern world.