via http://ift.tt/1nPMD2b Across the US, libraries are setting up maker labs as they turn themselves into hubs for high-tech innovation and training
via http://ift.tt/eA8V8J The number is in for how much Apple might have to pay to consumers to settle charges that it conspired with publishers to raise e-book prices: $400 million.
via http://ift.tt/1ys90yb Amazon’s Prime service is already perfect for binge watchers and shoppers. Now, Amazon wants to lure binge readers to the service, too.
Real Simple: Between urgent work e-mails, status updates, tweets, and magazines, you read all the time, right? But when was the last time you lost yourself in a book? The […]
Want to understand what is going on in the world of webcomics and digital comics publishing? Here’s a book all about the current digital landscape, available as a crowdfunded preorder. You can also crowdfund a magical transgender TV series, a comic about an all-female space crew, and a chance to have your own superhero action figure.
If you love books as physical objects, just imagine burying your nose between the pages of these beauties: massive atlases, photobooks, and tributes to the written word.
“Flying Around Book Ops” is a short video by photographer Nate Bolt featuring drone footage of the massive book-sorting center in Queens, New York, which provides material for the 150 branches of the New York and Brooklyn Public Libraries. The machine highlighted in the piece is the second largest in the world, sorting 33,000 […]
Most books do not survive for more than a century, and we know very little about what people’s everyday reading experiences were like over the past 500 years. Luckily, says the British Library, we can look to classical paintings. There, we find images of people reading — and pictures of what typical books looked like.
Photographer Bill Waldman captured this fantastic image of Cici James, musician and the proprietor of Singularity & Co in Brooklyn, who was made up in body paint by artist Adam DuShole to blend in with the books on a shelf in her store. A couple of months ago I was shooting my friend Cici at […]
“Books do not live essentially because they prove something or promote something else; they live because they give pleasure to the reader.” At Alleycat Books, 3036 24th Street.
A neutral Internet—one where Internet service providers (ISPs) can’t unfairly limit our access to parts of the Net, create special fast lanes for some services, or otherwise handle data in non-neutral ways—will require more than just rules that prohibit bad conduct. We’re also going to need real transparency.
Transparency is the crucial first step toward meaningful network neutrality. Without a detailed and substantive window into how providers are managing their networks, users will be unable to determine why some webpages are slow to load. New services that hope to reach those users will have a harder time figuring out if there is some artificial barrier in place, and competitors won’t know whether and how they can offer better options (assuming some kind of competitive environment exists).
Fortunately, the FCC realizes how important transparency will be in ensuring a neutral Net. A key section of the network neutrality proposal released by the FCC last month asks for comments on how the agency should require Internet providers to disclose how they manage traffic over their networks. Here are some initial thoughts.

Today, we’re in the dark
The FCC’s current transparency requirements are too vague to catch most of the harms of non-neutral behavior. At the moment the only thing an ISP has to do to be “transparent” by FCC standards is “publicly disclose accurate information regarding the network management practices, performance, and commercial terms of its broadband Internet access services.”
For most Internet providers this means a quick paragraph or two on their website describing at a very high level how they deal with congestion, and perhaps some statistics about how close their advertised speeds are to the true speeds users experience.
In order to generate these statistics, many of the largest ISPs take part in an FCC study called Measuring Broadband America. This ongoing study uses third-party white boxes (router-like devices that users plug into their home Internet connections) distributed to volunteers across the country to measure broadband speeds. The study averages data about download and upload speed and latency over the period of a month. (Latency is the time it takes for a packet of data to travel from one point on the network to another.)
Unfortunately, Measuring Broadband America, in its current form, can’t detect most of the harms of non-neutral network practices. That’s because most of its tests only measure the speed of a connection to artificial testing servers, not connections to popular websites that people normally access in the course of their browsing. Current testing would never capture, for example, the recent problems with slow Netflix download speeds for Comcast and Verizon subscribers.
The only current test that does measure how long it takes to access popular websites isn’t very rigorous and is limited to webpage loading time, not capturing other essential factors that indicate forms of ISP misbehavior, like application-specific traffic discrimination or content modification.

We need more sunshine
If the FCC plans to issue net neutrality rules that actually make a difference, the agency needs to expand on its transparency requirements and demand that ISPs disclose more details about the management of their networks.
More specifically, in addition to measuring download and upload speed and latency, ISPs should also disclose statistics on jitter, uptime, packet loss, and packet corruption, among other details. Here’s what those terms mean:
- Jitter is the variability in the latency of packets, i.e., how much the delay between a packet being sent from its source and being received at its destination changes over time. Low jitter is important for applications like VoIP and video-chat, because if packets take different lengths of time to travel, the resulting audio or video stream can appear jumpy.
- Uptime is the percentage of time a user’s Internet connection is actually available. Uptime is important because even if your connection is ridiculously fast, it’s not very useful if it’s down most of the time.
- Packet loss is the percentage of packets that never make it to their destination, usually as a result of being dropped due to congestion.
- Packet corruption is the percentage of packets that are corrupted while in transit.
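As a rough illustration of how these statistics relate to raw measurements, here is a minimal sketch. The probe data is invented for illustration, and the jitter figure is a simple mean of successive latency differences rather than the smoothed estimator that real measurement tools typically use:

```python
# Hypothetical per-probe results from a white-box test: each entry is the
# measured latency in ms, or None if the packet never arrived.
probes = [20.1, 22.3, None, 19.8, 25.0, None, 21.2, 20.5]

received = [p for p in probes if p is not None]

# Packet loss: fraction of probes that never reached their destination.
packet_loss = 1 - len(received) / len(probes)

# Jitter: variability in latency, here estimated as the mean absolute
# difference between latencies of successive received packets.
diffs = [abs(b - a) for a, b in zip(received, received[1:])]
jitter = sum(diffs) / len(diffs)

print(f"packet loss:  {packet_loss:.1%}")
print(f"mean latency: {sum(received) / len(received):.1f} ms")
print(f"jitter:       {jitter:.2f} ms")
```

Uptime would be computed the same way, as the fraction of probe intervals during which the connection was reachable at all.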
This data also needs to be reported in a more granular form than it is currently. Right now ISPs only report one-month averages, released every six months. We need data on an ongoing basis so that users and the FCC can catch harmful changes to ISP network management procedures more quickly.
Transparency will also require ISPs to do more than just test against their own servers. We know all too well that ISPs can offer wildly different qualities of service depending on their peering arrangements. And in a growing number of cases, websites are paying ISPs directly for interconnection instead of paying a web backbone company, as in the deal struck between Comcast and Netflix.
Reports of network quality need to capture the experience a customer will get when talking to a large set of endpoints that are (1) well connected to the Internet backbone and (2) unwilling or unable to pay ISPs for special peering arrangements. In other words, we need to know the kind of service received by companies that have special peering or interconnection deals, as well as the kind of service ISPs give to startups that cannot afford such deals.
For instance, if an ISP hosts its own material or its own services, performance metrics for those services should be tabulated separately from those for servers hosted in unaffiliated data centers. Expanding testing this way will capture any discriminatory tiers that ISPs are implementing in their peering, hosting, and content delivery network arrangements. The FCC should require the disclosure of a range of statistics about these metrics, as well as their average values.

Levels of detail
Ideally all of this reporting would take the form of a cumulative distribution function, a graph which would allow endpoint service providers and consumer watchdogs to estimate the worst network problems consumers would experience 1% of the time, 5% of the time, etc., so that the public can get a sense of how variable they should expect their service to be.
In addition to reporting how often various levels of service are achieved, these statistics should also be reported as a function of what percentage of subscribers achieved those statistics on average (i.e. what percentage of individual customers had average speeds, latencies, uptimes, etc. at a given value) so that regulators can verify that ISPs are providing the same level of service to all of their customers.
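To make the idea concrete, here is a minimal sketch of how worst-case figures could be read off such a distribution, using the simple nearest-rank percentile method and invented latency samples:

```python
# Hypothetical latency samples (in ms) collected over a reporting period.
samples = sorted([18, 20, 21, 19, 250, 22, 20, 480, 21, 19,
                  20, 23, 22, 19, 21, 20, 95, 22, 20, 21])

def percentile(data, p):
    """Nearest-rank percentile: the value below which roughly p percent
    of the sorted samples fall."""
    k = max(0, min(len(data) - 1, round(p / 100 * len(data)) - 1))
    return data[k]

# The worst latency a user sees 5% (or 1%) of the time corresponds to
# the 95th (or 99th) percentile of the distribution.
for p in (50, 95, 99):
    print(f"p{p} latency: {percentile(samples, p)} ms")
```

A real disclosure regime would publish the full distribution per service tier (and per subscriber, in aggregate), so that watchdogs could compute any percentile they need rather than relying on a single average.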
Finally, although not strictly a transparency requirement, we believe that consumer watchdogs should begin testing ISPs for other forms of non-neutral behavior, specifically application blocking, throttling, and content modification. This sort of discrimination can be just as damaging as unfair peering or interconnection agreements, and we will need to be on guard that ISPs do not attempt to skirt any net neutrality rules this way.

Transparency must be part of the rules
Of course, disclosure alone is not enough to protect the promise of an open Internet. But transparency—when properly implemented—can be a powerful tool.
To make the tool effective, transparency rules must result in information that can be used by both experts and everyday users. And user-facing disclosures shouldn’t be shallow just because they’re written in layman’s terms.
You can help make that a reality. It’s time to contact the FCC and send a clear message: It’s our Internet, and one important way to protect it is real, current, and meaningful information about the practices that are shaping our user experience. Visit DearFCC.org and take action now.
Apps that force kids to log book time as a way to earn Internet and TV access are a huge mistake.
Recently, while cooling my heels at the airport, I overheard a boy of about 6 begging his mom to let him play with the family iPad. “No screen time until you do an hour of reading first,” was her reply. The child flung himself back in his seat and opened a paperback book with a disgruntled sigh.
I winced. Of course parents need to supervise their kids’ use of digital devices and the Internet. God only knows, plenty of adults have a hard enough time managing their own screen time, including people with a fundamental investment in literary culture, like novelists Zadie Smith (who uses the Internet-blocking software Freedom) and Jonathan Franzen (who has disabled the Ethernet port on his writing computer). The American Academy of Pediatrics recommends no exposure at all to television, computers and cellphones for children under the age of 2 and suggests that older children’s “total entertainment screen time” be limited to one or two hours per day.
Most parents are also acutely aware of the importance of good reading skills to their children’s academic future. If they’re particularly well-informed, they’re aware that a recent report from Common Sense Media indicates the number of children aged 8 to 17 reading for pleasure has dropped significantly in the past few years. Digital media is frequently blamed for distracting kids from books, and so perhaps it’s not surprising that some parents have gotten the idea of using screen time as an incentive for page time.
The site Reading-rewards.com, for example, was set up by parents who decided to “put a system in place whereby their kids had to earn TV or game console time by reading: 1 minute TV time for every minute of reading.” Recently, FreeTime, a user-profile control app on the Kindle Fire, introduced a new setting by which parents can require their kids to spend a certain amount of time reading e-books before they can access the device’s games. In essence, these are digital versions of a clever “game token” allowance system created by a contributor to the Instructables website; her kids earn handmade chips good for computer time by doing chores around the house.
But there’s the rub: Reading should not be a chore. Chores are tasks that nobody wants to do but that have to be done all the same. Life is full of such activities. Part of being an adult is learning to suck it up and take care of them, yet another thing parents have to teach their kids. Kids often have to be bribed to do this with an allowance or game tokens or some other treat because kids aren’t big on the long view. They don’t care that if they don’t wash the dishes tonight, there will be no clean ones to eat off of tomorrow, because tomorrow seems so irrelevantly far away.
To make an hour spent with a book into the equivalent of loading the dishwasher is to send the strong, implicit message that reading is a similar task, one that will never be a source of pleasure. You may end up with kids who have logged lots of hours of reading, but that won’t make readers out of them. There’s a vast difference between dutiful, grudging, joyless reading and the kind of hungry, engaged reading that makes for a good student and a thoughtful citizen. It’s hard to be good at something you don’t enjoy.
The FreeTime read-for-play control makes this bad idea even worse by offloading its enforcement onto a mindless bot. It reinforces the idea that reading is the intellectual equivalent of the spinach you have to eat in order to get dessert, and it suggests that the whole transaction is so tedious your parents can’t even be bothered to enforce it in person. Is it any wonder, then, that reading rates drop precipitously once kids enter their teens and begin to scrutinize the double standards of the adults in their lives?
There’s abundant research indicating that the primary way children learn to love reading is by growing up with adults who frequently choose to read for pleasure. Having plenty of books around the house is another contributing factor. Just as important, though less discussed, is making an effort to help kids find books that appeal to them. Even parents who like to read can be discouraged when their own childhood favorites don’t win over their offspring. But children are just like adults: each is an individual with particular tastes, and helping them find the books that speak to those tastes is a major part of improving their reading skills.
There are some digital tools that might help with this, such as Wandoo Planet’s “interest tree” generator, designed to coax out a child’s preferences and provide some tailored recommendations. Book subscription services, like Epic, the kids’ equivalent of Oyster and Scribd, can provide a broad selection of e-books to browse through. But the best guide will almost always be an observant and helpful adult: a teacher, a librarian, a children’s bookseller — in other words, the kind of person who has devoted her whole life to helping children fall in love with books and who appreciates just how personalized the process should be. (That’s who did it for me — thanks again, Mrs. Belden!) The way to get kids to invest their time in reading is to be willing to invest some time and energy in it ourselves.
PaperLater is a service from Newspaper Club, the newspaper printing company, that turns writing on the web into printed newspapers that are then mailed to users. All users need to do is hit a “Save to PaperLater” button once they’re part of the service, and when enough articles have been added it can be printed […]
Fair use enjoyed a major victory in court today. In Authors Guild v. HathiTrust, the Second Circuit Court of Appeals handed down a decision that strongly underscores a fair use justification for a major book scanning program. For those counting along at home, today’s decision marks another in a serious streak of judicial findings of fair use for mass book digitization, including Authors Guild v. Google, Cambridge University Press v. Becker, and the district court opinion in the HathiTrust case itself.
Given that consistent fair use record for book digitization, today’s ruling might not be totally surprising. Still, the text of the opinion is encouraging, and reflects a court that respects the Constitutional purpose of copyright as a tool to promote the progress of science and the useful arts—not a blunt instrument for rightsholders to regulate all downstream uses.
HathiTrust was set up by several research universities to operate a digital library containing electronic scans of the universities’ books (Google provided the scans as part of its Google Books project). The Authors Guild took issue with three practices that HathiTrust engages in: a full-text database that returns the book name and page number for matching search results; a service to make text available in formats accessible to print-disabled people; and a long-term archive to preserve books that might become unavailable during the term of their copyright restrictions.
With respect to the full-text database, the court found that although a copy of the entire work is made, the purpose of a full-text searchable database is so different from that of the underlying works that the use must be considered transformative. In fact, the court wrote, “the creation of a full‐text searchable database is a quintessentially transformative use”.
The Authors Guild also argued that HathiTrust’s use of an identical server and two tape back-ups constituted “excessive” copying. Thankfully, the court rejected that premise, acknowledging that when it comes to digital technology, an approach that focuses only on individual copies made is insufficient.
The court then looked at the Authors Guild’s “lost sale” argument—that its authors could have instead licensed their texts for paid inclusion in the database—and found it unconvincing. Fair use analysis requires a look at the harm in the marketplace that a use might create, but the court clarified that it only addresses economic harm that comes from a use serving as a substitute. After all, even if a scathing book review causes economic harm to a new book, the quotes it incorporates are no less fair use.
That means that the market harm argument can’t be used against a highly transformative use like a searchable database. As the court put it, “any economic ‘harm’ caused by transformative uses does not count because such uses, by definition, do not serve as substitutes for the original work.”
Turning to the accessibility features that HathiTrust offers, the court adopted a slightly different analysis, but continued to find for fair use. The use is not as fundamentally transformative, because the purpose of making a text available for reading or listening is unchanged.
Still, the court decided that the use is a fair one, citing the legislative history of the Copyright Act and the Americans with Disabilities Act. That robust view of copyright exceptions like fair use is great to see from the federal judiciary, and consistent with the international norms we’ve seen advanced in agreements like last year’s Marrakesh Treaty for the Blind.
Finally, the court remanded the issues on the last program, the long-term preservation of books. Importantly, it remanded on standing—meaning it’s not clear that the Authors Guild even has grounds to raise this complaint. The district court is left to determine whether Authors Guild can demonstrate that its copyrighted works are even affected.
That standing question raises a larger issue that has percolated through this series of high-profile legal losses for the Authors Guild. The Guild claims to represent the interests of all or most authors, but it has increasingly taken on expensive losing battles against technologies that would make texts more accessible. That’s a good argument for new groups like the Authors Alliance, which represent authors who are primarily concerned with being read.
With today’s fair use ruling, mass book scanning projects are on firmer ground than ever to continue under the protection of fair use. With that legal hurdle cleared, these services can hope to deliver what one earlier judge referred to as an “invaluable contribution to the progress of science and cultivation of the arts.”