OPLIN 4cast #433: This .sucks

April 15th, 2015

A little more than a year ago, the Internet Corporation for Assigned Names and Numbers (ICANN) began implementing a plan to expand the number of generic top-level domains (gTLDs) on the Internet – domains like .com, .net, and .org – from 22 to (eventually) over 1,000. The first seven new gTLDs added in early 2014, for example, were .bike, .clothing, .guru, .holdings, .plumbing, .singles, and .ventures. One gTLD that’s about to be launched is causing concern: the .sucks gTLD is commanding very high prices, as big companies and celebrities pay top dollar to keep the general public from registering domain names like <celebrityname>.sucks. In retrospect, it’s hard to see how this gTLD could not have turned out badly. Fortunately, libraries do not suck.
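
For the curious, one rough way to see whether names under these new gTLDs are actually in use is to ask DNS whether they resolve. This is only a sketch (the names below are made up for illustration, and a registered-but-parked name may not resolve at all; a real availability check would go through a registrar or WHOIS/RDAP):

    # Quick check of whether names under some of the new gTLDs resolve in DNS.
    # Resolution only shows a name is registered *and* pointed at a server; a
    # registered-but-parked name may not resolve, so this is not an
    # availability check.
    import socket

    names = [
        "example.sucks",   # hypothetical second-level name in the .sucks gTLD
        "example.guru",    # hypothetical name in one of the 2014 gTLDs
        "oplin.org",       # a long-established .org for comparison
    ]

    for name in names:
        try:
            addr = socket.gethostbyname(name)
            print(f"{name:15} resolves to {addr}")
        except socket.gaierror:
            print(f"{name:15} does not resolve")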

  • Internet naming body moves to crack down on ‘.sucks’ (Associated Press)  “The Internet Corporation for Assigned Names and Numbers, or ICANN, on Thursday sent a letter to the U.S. Federal Trade Commission and Canada’s Office of Consumer Affairs to see if the actions of company Vox Populi Registry Ltd. are illegal. ICANN initially approved of the so-called top-level domain name, among nearly 600 it has added recently to expand beyond common names such as ‘.com,’ ‘.org’ and ‘.us.’ But it is backtracking after an advisory panel made up of industry groups and companies like Microsoft, Verizon and eBay complained last month.”
  • Internet naming group asks FTC to investigate .sucks controversy (Yahoo! Finance | Aaron Pressman)  “The Internet Corporation for Assigned Names and Numbers (ICANN), a non-profit that sets policies for the global domain name system, said in a letter to the FTC and Canada’s Office of Consumer Affairs that companies have complained the system is ‘predatory, exploitive and coercive.’ ICANN asked the regulatory bodies to determine whether any laws had been broken. ‘ICANN is concerned about the contentions of illicit actions being expressed, but notes that ICANN has limited expertise or authority to determine the legality of Vox Populi’s positions, which we believe would fall in your respective regulatory regimes,’ the group said in the letter.”
  • “.sucks” registrations begin soon—at up to $2,500 per domain (Ars Technica | Lee Hutchinson)  “The pricing situation around .sucks domain names is complicated. Companies with registered trademarks will have to pay an astounding $2,499 to register their trademarked names in .sucks. Registration of non-trademarked names during the ‘sunrise’ period (March 30 until June 1) before .sucks goes live will cost at least $299 per name, while the standard registration fee after June 1 goes to at least $249 per name. Companies are typically hyper-sensitive about brand usage, and few will want their .sucks domains under someone else’s control.”
  • New .sucks domain stirs up storm over free speech (The Star Online)  “John Berard, chief executive at Vox Populi, told AFP the new domain is something that companies can use to engage with consumers, and that he sees the word ‘sucks’ as ‘edgy’ but not pejorative. ‘We think we’re creating an opportunity for interaction that is meaningful,’ he said. ‘If a company were to establish its own .sucks site and drive that discussion to a centralized location it might be quite a valuable asset.’ Berard added that the pricing ‘reflects what we believe to be the value of the names.’”

OPLIN 4cast #432: You want privacy?

April 8th, 2015

Just about everybody would like to feel that the things they do on the Internet remain private, at least sometimes. In reality, many of us, according to the Pew folks, expect that someone in the government will be snooping on us, or assume that someday some hacker will break into a company’s data and get information about us. But we also assume that when a company says it will keep our data private, it will at least try to protect our privacy. Judging by a couple of recent news stories, that does not always happen.

  • Report: Facebook tracks all visitors, even if you’re not a user and opted out (Ars Technica | Glyn Moody)  “This newly found tracking, used to provide targeted advertising, is carried out through Facebook’s social widget, the Like Button. A cookie is placed in the browser when someone visits any page in the facebook.com domain, including sections that do not require an account. For visitors that are not Facebook users, the cookie contains a unique identifier, and it has an expiration date of two years. Facebook users receive additional cookies that identify them uniquely.”
  • Facebook ‘tracks all visitors, breaching EU law’ (The Guardian | Samuel Gibbs)  “EU privacy law states that prior consent must be given before issuing a cookie or performing tracking, unless it is necessary for either the networking required to connect to the service (‘criterion A’) or to deliver a service specifically requested by the user (‘criterion B’). The same law requires websites to notify users on their first visit to a site that it uses cookies, requesting consent to do so.”
  • RadioShack’s bankruptcy could give your customer data to the highest bidder (Bloomberg Business | Joshua Brustein)  “RadioShack’s customers—even those whose most recent purchase came years ago—could also find themselves sold off in the deal. The company included personal data in its bankruptcy auction as its own asset class. A website maintained by Hilco Streambank, which is serving as an intermediary for RadioShack, says that more than 13 million e-mail addresses and 65 million customer names and physical address files are for sale.”
  • How safe is your information when a company goes bankrupt? (Dallas Morning News | Michael A. Lindenberger)  “As with many bankrupt firms, that list would have been worth a fortune to the right buyer, especially if the data could be sold free of any obligation to keep it private. Like most large and reputable companies, RadioShack had promised customers not to sell the data it collected to third parties. But bankruptcy court is a unique setting in American law, and one of its chief purposes is to maximize the value of a bankrupt company’s assets even if it must sever otherwise valid contracts.”
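
The Ars Technica excerpt above describes a long-lived cookie set on anyone who visits a facebook.com page. As a minimal sketch of how you might poke at this yourself (assuming the third-party requests package is installed), the following lists the cookies a single response sets and when they expire; it only captures that one response, not the cookies planted later through Like buttons embedded on other sites:

    # List the cookies set by a single HTTP response, with expiration dates.
    # Any URL can be substituted; this does not capture cookies set later by
    # JavaScript or by social widgets embedded in third-party pages.
    import datetime
    import requests  # third-party: pip install requests

    resp = requests.get("https://www.facebook.com/", timeout=10)

    for cookie in resp.cookies:
        if cookie.expires:
            expires = datetime.datetime.fromtimestamp(
                cookie.expires, tz=datetime.timezone.utc
            ).date()
        else:
            expires = "session"
        print(f"{cookie.name:15} domain={cookie.domain:20} expires={expires}")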

OPLIN 4cast #431: Morphing antennas

April 1st, 2015

Ethertronics Inc. got some attention in the technology media about a month ago when it announced a chip for wireless routers that changes the radiation pattern of the router’s antennas to fit the characteristics of a space. Ethertronics (and some other companies) have been developing “reconfigurable” antennas for mobile phones and tablets for some time, and at least one university is also doing research in this area, reportedly with military funding. Reconfigurable antennas in routers could improve wireless performance substantially and might eliminate the need for multiple wireless routers in some places.

  • Wi-Fi signals can now penetrate thick walls and cover long distances (Deccan Chronicle | Francis D’Sa)  “The chip enables Active Steering technique (signal steering) by monitoring the RF link performance and uses a closed group of predictive algorithms to select the best antenna radiation and pattern for superior performance. The technology works best with 802.11 ac devices and enables wireless signals to penetrate thick walls, ceilings and alike to reach further distances, where conventional routers cannot today.”
  • Active antennas could mean more powerful Wi-Fi networks (GigaOm | Kevin Fitchard)  “Ethertronics has designed the antenna technology that has gone into more than a billion mobile devices (if you own a Samsung Galaxy device chances are you’re talking and surfing through an Ethertronics rig), but its active steering technology hasn’t yet made it into a mobile device, though it is engaged in several trials with carriers. [Chief Scientist Jeff] Shamblin, however, thinks that the technology stands a good shot in the Wi-Fi market as we increasingly hook more devices into home wireless networks from TVs and stereos, to wearables and smart home appliances.”
  • Ethertronics goes for WiFi (Electronics Weekly | David Manners)  “The technology allows three physical antenna to act as 12 virtual antenna. ‘We take one antenna and by changing the radiation pattern of the antenna we can generate multiple radiation patterns from a single antenna,’ [COO Vahid] Manian told EW.  Beam steering, by sampling and switching between the multiple radiation patterns, selects the antenna radiation pattern that provides the best RF link performance. This delivers, says the company, improvements in range, data throughput, interference reduction, robustness in multipath environments, and connection reliability.”
  • Wi-Fi beam-steering tech could kill off fixed home networks (Faultline [via The Register] | Peter White)  “The system also comes with some predictive algorithms so that once it has worked out which is the best way to deliver a particular signal, it gets better at finding the most effective radiation pattern more immediately. The company already sees future markets for active antennas in hospitals, inventory tracking, traffic control, car-to-car control, metering, cameras and Internet of Things sensors.”

OPLIN 4cast #430: A different approach to stopping piracy

March 25th, 2015

Any library that has a sizable number of patrons using the library’s public Internet computers has probably seen more than a few copyright infringement notices. (They’re often passed along to the library through OPLIN; we get them because we’re on file as the “owner” of most Ohio public library IP addresses.) It’s inevitable, given enough people, that someone in the library is going to download something that is protected by copyright, and the companies that are paid by the movie and music industries to watch for illegal downloads then issue an infringement notice. Now the entertainment industries are suggesting a different approach: pressuring the Internet Corporation for Assigned Names and Numbers (ICANN) to unregister websites that the industries believe are distributing pirated materials.

  • MPAA, RIAA urge ICANN to do more about copyright infringement (Techspot | Shawn Knight)  “Back in 2013 when ICANN opened up top-level domains, it included a provision in its contract with the registries of the new top-level domains called Public Interest Commitments. Part of that agreement mandated that they’d only do business with domain name registrars that prohibited its customers from distributing piracy, trademark and copyright infringement in addition to a host of other nefarious activities. With that, ICANN suddenly finds itself in a sticky situation. If they are strong-armed into policing such sites, how do they go about doing it?”
  • Hollywood asks domain registrars to censor the Web for intellectual property infringement (Electronic Frontier Foundation | Jeremy Malcolm and Maira Sutton)  “The report goes on to claim that registrars are required, under agreements with ICANN, to take action by locking or suspending domains when they receive a notice about one of their domains facilitating illegal activity. This isn’t true, and by claiming it is, USTR [United States Trade Representative] is here repeating the United States entertainment industry’s current talking points. What is true is that the RIAA and MPAA have recently asked ICANN, the multi-stakeholder body that administers the global domain name system, to establish that domain registrars must take down websites over copyright infringement complaints.”
  • RIAA escalates piracy witchhunt, points its pitchfork at domain registrars (Digital Trends | Chris Leo Palermino)  “The US government…agrees with the RIAA’s stance that domain registrars should be responsible for court-ordered takedown notices. The report specifically mentions one domain registrar, Canada-based Tucows, as one that hasn’t taken action when notified of its clients’ infringing activity. Furthermore, ‘some registrars even advertise to the online community that they will not take action against illicit activity,’ according to the report. Even if domain registrars did follow court orders, the RIAA wants more: they want the domain registrars to take sites down when someone reports copyright infringing content — not just when the court tells them to.”
  • ICANN, copyright infringement, and “the public interest” (The Washington Post | David Post)  “In a sense, it looks harmless enough; if you want to register frabulous.app, or washingtonpost.blog, or dewey-cheatem-and-howe.attorney, or any other 2d-level domain in these TLDs, what’s wrong with making you promise not to engage in ‘piracy’ or ‘fraud,’ or any activity ‘contrary to applicable law’? Who wouldn’t promise such a thing? It is, however, anything but harmless – and the RIAA/MPAA letters show why. Registries and registrars will henceforth have to satisfy ICANN that they are taking appropriate steps to suspend end-users who engage in ‘piracy’ or any activity ‘contrary to applicable law’; if they do not, they risk losing their place in the DNS.”

OPLIN 4cast #429: The tech blogging business

March 18th, 2015

We read a lot of technology blogs every week, watching for things that might interest readers of this 4cast blog. Recently, a couple of the technology blogs themselves have become news, as ReadWrite was sold last month, and GigaOm announced last week that it was closing its business. Is this a trend? Possibly. The fact is, many mainstream news publishers now include a technology blog on their own websites, such as The New York Times’ Bits, the Wall Street Journal’s Digits, the Huffington Post’s HuffPost Tech, and any number of blogs associated with major local daily newspapers. It could be that we’re seeing the beginning of the end of the independent technology blog.

  • Gigaom shuts down in crowded tech media landscape (Wall Street Journal | Steven Perlberg)  “Gigaom’s fate had others in the digital media industry wondering if the event portended a larger ‘content shakeout,’ especially given that Say Media recently sold its own tech site ReadWrite. Executives in the media and venture capital worlds say it is quite difficult to be a self-sustaining site these days, considering the crowded tech media space. ‘When everybody from The Wall Street Journal to the New York Times to everybody else has doubled down on covering tech, what’s your value?’ said Rafat Ali, the CEO and founder of travel site Skift.”
  • Tech blog GigaOm abruptly shuts down (The New York Times | Ravi Somaiya)  “The site, long known for both its business and consumer-facing technology posts, had been open to experimentation in its business model. Like other media start-ups, Gigaom hosted a series of technology conferences that charged high prices for admission. The company offered a white-paper research business, and also sold advertising.”
  • Should we crowdfund ReadWrite? (ReadWrite | Owen Thomas)  “Our former publisher, Say Media, was prepared to shut us down if we hadn’t found a new owner. Wearable World is a good home for ReadWrite, and I’m energized by the conversations I have with our CEO, Redg Snodgrass, about our shared mission to democratize technology. Gigaom’s closure is a stark reminder, though, that the conventional advertising model, even when complemented by businesses like events and research, isn’t always enough to allow independent reporting about technology to thrive.”
  • Platishers, beware: Say Media gives up on publishing (Digiday | Lucia Moses)  “The latest pivot adds to the company’s already complicated story. It also serves as a warning to other digital natives like Vox Media, Business Insider and pretty much anyone with a custom CMS and a pile of VC cash that are trying to figure out how to be the media company of the future by marrying media and tech. It sounds good to say you’ll do both — and that’s probably what venture backers want to hear — but when it comes to execution, the demands of both businesses make a combination very difficult.”

OPLIN 4cast #428: Digital vs. print textbooks

March 11th, 2015

There are conflicting signals coming out of colleges lately in regard to digital textbooks. Are they the wave of the future, or are print textbooks better? There’s a considerable amount of money at stake here (I think we all know how crazy-expensive college textbooks can be), and textbook companies by and large seem to be placing their bets on digital and getting out of the print business. Students, on the other hand, don’t seem to be convinced, with a couple of studies indicating that they prefer print.

  • Should college textbooks go digital? (TechCocktail | Scott Huntington)  “In this digital age, we carry around much of the world’s collective knowledge and history in our pockets. So why are college students asked to pay thousands of dollars for heavy doorstops that are technologically on par with something out of the Bronze Age? This is especially true in fields like science and economics where the information is changing so often that books can be out-of-date before they even hit the shelves.”
  • The death of textbooks? (The Atlantic | Terrance F. Ross)  “Nostalgia aside, it may come as a relief to many, then, that textbooks are becoming anachronistic. Digital in-class learning materials, like software that adapts to the ways in which individual students acquire information, and other forms of virtual education content are becoming more effective and intelligent. College-affordability advocates and others hope this growth could result in the normalization of less costly or even free materials down the road.”
  • Why Chegg is abandoning a business worth over $200 million a year (Fast Company | Ainsley O’Connell)  “Over the next 18 months Chegg will liquidate its print inventory and refocus on its digital products, including self-guided homework help and on-demand tutoring. Students will continue to rent through Chegg’s platform, with the company taking a 20% take on the print textbooks and relying on Ingram to manage operations. The new strategy will widen the scope of Chegg’s digital operations in order to better serve student needs at a time when other companies in the higher education and professional training markets are looking to do the same.”
  • Students reject digital textbooks (Shinyshiny | Diane Shipley)  “Given how heavy textbooks can be, you’d think ebooks would be a huge bonus of being at university now, as opposed to ten years ago. And not only do they weigh less, you can buy or borrow them instantly instead of having to schlepp to the university bookshop or library, search within them easily, and highlight and make notes without being accused of defacing anything. So you’d think students would be all over digital textbooks. But they’re not.”

OPLIN 4cast #427: TV white spaces

March 4th, 2015

In June 2009, television stations in the United States stopped broadcasting analog signals and switched to digital transmissions. The switch freed up large blocks of broadcast spectrum between 50 MHz and 700 MHz that are not needed for digital TV and are available for other uses. One possible use of this “white space” is wireless broadband Internet access, using relatively inexpensive equipment to transmit Internet data over these frequencies rather than over a physical connection or cellular wireless. So does this technology have any value for libraries? Possibly. The Gigabit Libraries Network is currently leading a WhiteSpace Pilot project to demonstrate how TV white space “…can increase availability and convenience of Wi-Fi access at tens of thousands of new fixed and portable public library community hotspots.”
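
Part of the appeal is basic physics: free-space path loss grows with frequency, so a signal in the sub-700 MHz TV band loses much less strength over a given distance than a 2.4 GHz or 5 GHz Wi-Fi signal. A rough back-of-the-envelope sketch using the standard free-space formula (real coverage also depends on power limits, antennas, and obstacles, which this ignores):

    # Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    # Rough illustration of why sub-700 MHz "white space" signals reach farther
    # than 2.4/5 GHz Wi-Fi. Ignores transmit power limits, antenna gain, walls.
    import math

    def fspl_db(distance_km: float, freq_mhz: float) -> float:
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    for freq in (600, 2400, 5800):        # MHz: TV white space vs. Wi-Fi bands
        loss = fspl_db(1.0, freq)         # loss over 1 km of free space
        print(f"{freq:5} MHz: {loss:5.1f} dB over 1 km")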

  • Microsoft-backed TV white spaces trial goes commercial in Ghana (ZDNet | Adam Oxford)  “TV white spaces, otherwise known as dynamic spectrum allocation, is seen as a promising form of connectivity for extending broadband networks to rural areas across the world – including parts of the US. It works on unlicensed areas of the radio frequency spectrum that are allocated for analogue TV channels, using gaps in the signal to carry internet traffic. Google, Facebook, and Microsoft have all run white spaces pilots in Africa, and it is considered a promising alternative for broadband access where building a commercial case for 4G or fibre is tough.”
  • White Space, the next internet disruption: 10 things to know (TechRepublic | Lyndsey Gilpin)  “Television networks leave gaps between channels for buffering purposes, and this space in the wireless spectrum is similar to what is used for 4G and so it can be used to deliver widespread broadband internet. Typical home Wi-Fi can travel through two walls. White Space broadband can travel up to 10 kilometers, through vegetation, buildings, and other obstacles. Tablets, phones, and computers can all access this wireless internet using White Space through fixed or portable power stations.”
  • TV white space will connect the internet of things (Wired UK | James Temperton)  “Uses for the technology currently being trialled include live video streaming of meerkats at London Zoo and sensor networks to provide flood warnings on the Thames and Cherwell rivers near Oxford. Trials have also been carried out to bring faster broadband connections to ships travelling near the Orkney Islands. The first commercial uses of the technology are expected by the end of 2015.”
  • Libraries to expand as TVWS hot-spots with new Knight project (CivSource | Bailey McCann)  “Phase two of the project – with the aid of Knight funding – will expand the role of libraries using TVWS. Participants will be encouraged to think of ways to use TVWS/WiFi for community disaster planning as a redundant and potentially community resource. Ideas to explore include how to use libraries as a headquarters during disasters or as pop-up hotspots around the community.”

OPLIN 4cast #426: The next version of HTTP

February 25th, 2015

Last week, the Internet Engineering Steering Group (IESG) posted a message stating that they had “…approved the following document: ‘Hypertext Transfer Protocol version 2’…as Proposed Standard.” While this sounds pretty innocuous and/or cryptic to many of us, this announcement actually marked the official beginning of HTTP/2, the long-awaited successor to HTTP/1.1, the information transfer protocol currently used by the World Wide Web (allowing the hypertext linking between web pages mentioned in last week’s 4cast). While that still may not mean much to most of us, we should all notice that the Web will respond a bit faster in the future than it does now, once HTTP/2 becomes the common standard.
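
If you want to see which protocol version a given server actually negotiates, here is a minimal sketch, assuming the third-party httpx package installed with its HTTP/2 extra (the URLs are just examples):

    # Check which HTTP version a server negotiates with an HTTP/2-capable client.
    # Requires the third-party httpx package with its HTTP/2 extra:
    #   pip install "httpx[http2]"
    import httpx

    urls = ["https://www.google.com/", "https://oplin.org/"]

    with httpx.Client(http2=True) as client:
        for url in urls:
            resp = client.get(url)
            # http_version is "HTTP/2" when the server speaks it, else "HTTP/1.1"
            print(f"{url:30} -> {resp.http_version}")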

  • The internet is about to get faster — here’s why (Business Insider | Peter Maynard)  “When a web page is requested, the server sends back the page, but must wait for the web browser to parse the page’s HTML and issue further requests for things it finds in the code, such as images. Server push allows the server to send all the resources associated with a page when the page is requested, without waiting. This will cut a lot of the latency associated with web connections. Once web servers and web browsers start implementing HTTP/2 – which could be as soon as a few weeks from now – the web-browsing experience will feel quicker and more responsive.”
  • Everything you need to know about HTTP2 (ReadWrite | Lauren Orsini)  “For the past 16 years, HTTP has basically done the heavy lifting of bringing Web pages to your browser. When you type a URL into your browser bar—readwrite.com, for instance—you’re actually creating an HTTP request to the Web server that instructs it to find and deliver a particular Web page. But HTTP has its limits. Modern Web pages pack in more features than just about anyone imagined back in 1999, making it more resource-intensive than ever just to load them in a browser.”
  • HTTP/2 finished, coming to browsers within weeks (Ars Technica | Peter Bright)  “In HTTP/2, multiple bidirectional streams are multiplexed over a single TCP connection. Each stream can carry a request/response pair, and multiple requests to a server can be made by using multiple streams. However, the streams are all independent; if one stream is slow, the HTTP/2 connection can still be used to transfer data belonging to other streams. Similarly, a client can request a large object and then a small object, and the response to the small object can be given before, or even during, the response to the large object.”
  • Don’t blame yourself for ignoring HTTP/2, the biggest HTTP update in years (VentureBeat | Cullen Macdonald)  “HTTP/2 is for sure going to add to the increase in rate of change for things on the Internet, but it’ll do it without being noticed. It will continue to be incorporated into more websites you visit, and browsers will more fully support the official spec. There won’t ever be an explosion of speed from your phone’s browser where you’ll ask yourself ‘oh! is today HTTP/2 day?!’”

OPLIN 4cast #425: Deep links

February 18th, 2015

One of the fundamental building blocks of the World Wide Web has always been the hyperlink, the code that allows you to jump from web page to web page and back again. These links function because each web page has an address that makes it possible to find the page when wanted, and also makes it possible for search engines to take you to a specific page of information upon request. Mobile apps do not share this basic functionality of the Web, and now that mobile apps occupy a larger and larger portion of the time people spend on the Internet, there is more and more pressure to recreate the Web experience within the mobile experience. “Deep linking” between apps may be one answer.
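
The basic idea behind a deep link is that a piece of content gets both a web URL and an app-specific URI, and a link can try the app first and fall back to the web when the app is not installed. A minimal sketch of that mapping follows; the exampleapp:// scheme is purely hypothetical:

    # Illustration of the deep-linking idea: the same piece of content has both
    # a web URL and an app URI with the same path. The "exampleapp://" scheme
    # below is hypothetical, standing in for a real app's registered scheme.
    from urllib.parse import urlparse

    def to_app_uri(web_url: str) -> str:
        """Map https://example.com/items/42 to exampleapp://items/42."""
        parts = urlparse(web_url)
        return f"exampleapp://{parts.path.lstrip('/')}"

    web_url = "https://example.com/items/42"
    print("web fallback:", web_url)
    print("app deep link:", to_app_uri(web_url))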

  • Apps everywhere, but no unifying link (NY Times Technology | Conor Dougherty)  “The app problem traces its origins to 2008, when Apple introduced the App Store for iPhones. Unlike websites, apps were set up to be separate little boxes whose technology prohibited them from interacting with one another. At the time, the idea of a phone full of apps was new enough that most people were not very worried about whether those apps could link to each other. But today, apps have begun to eclipse the web.”
  • Making apps as easy to search as the Web (Re/code | Ina Fried)  “To further complicate matters, it’s not only about the method for linking, it’s also about figuring out the right business opportunity. To search engines like Google that profess to catalog the world’s information, apps have been a threat — silos that need to be opened.”
  • Deep links, extensions turning apps into new mobile Web (MediaPost | Jeremy Shabtai)  “The general idea behind app deep linking is similar to hyperlinks found on the Web. An app is able to link directly to specific content buried within another app, allowing users to jump back-and-forth from one app to another. For example, Google Maps now has a link that takes you directly into the Uber app to call a car. Hitting the ‘back’ button actually navigates you back to the app page you just left. Mobile apps are finally starting to act like the Internet.”
  • Why deep linking matters (Taylor Davidson)  “Without the URL structure of the web, mobile app developers are forced to implement deep linking schemes to build back some of the basic functionality that the open web was originally built upon. Looking further, will deep linking recreate a complete URL structure for apps, or will it merely provide a short-term solution for mobile transactional ads and ecommerce? Even if the answer is only ‘merely’, it should unlock significant ecommerce mobile advertising spending and help lead to a much more diverse set of mobile advertising ad units and experiences.”

Articles from Ohio Web Library:

  1. Mobile deep linking: Should you care? (Folio: The Magazine for Magazine Management, March 2014, p19 | Much Speers)
  2. Facebook new ‘App Links’ mean no more switching from apps to Web browsers. (International Business Times, 5/3/2014)
  3. Branch announces funding for next generation deep linking technology. (PR Newswire US, 9/23/2014)

OPLIN 4cast #424: Trashing Flash

February 11th, 2015

Once upon a time, Adobe Flash was a feature of many websites and Internet ads that used animated graphics to catch the attention of site users. And quite a few Ohio public library websites still require Flash Player to view or listen to some of their site content. Now, however, a lot of websites are shying away from Flash, for a variety of reasons that have been summarized on the Occupy Flash website. In the past few weeks, there have been some serious security issues with Flash, and YouTube announced a move away from Flash technology to HTML5 for delivering Internet video, adding impetus to the calls for the demise of Flash. Here’s a sampling of some recent stories.
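
For a library wondering whether its own site still depends on Flash, a rough first pass is to scan pages for the telltale <object>/<embed> markup or .swf files. Here is a minimal, standard-library-only sketch (it will not catch Flash that is loaded dynamically by JavaScript):

    # Rough scan of one page for Flash content: looks for <object>/<embed> tags
    # with the Flash MIME type or .swf sources. Won't catch Flash injected by
    # JavaScript after the page loads.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class FlashFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hits = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("object", "embed"):
                mime = (attrs.get("type") or "").lower()
                src = (attrs.get("src") or attrs.get("data") or "").lower()
                if "shockwave-flash" in mime or src.endswith(".swf"):
                    self.hits.append((tag, src or mime))

    url = "https://www.example.org/"   # substitute a library site to check
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")

    finder = FlashFinder()
    finder.feed(html)
    print(f"{len(finder.hits)} Flash embed(s) found on {url}")
    for tag, detail in finder.hits:
        print(f"  <{tag}> {detail}")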

  • As Flash 0day exploits reach new level of meanness, what are users to do? (Ars Technica | Dan Goodin)  “The breakneck pace of the exploits is creating fatigue among end users, and one presumes, among engineers inside Adobe. No sooner is one patch rolled out than an exploit targeting a new vulnerability becomes available. What’s more, research from Cisco Systems found the recent Flash exploits were being served on more than 1,800 domains.”
  • Steve Jobs gets vindicated one last time (BGR | Brad Reed)  “Why is this a vindication for Steve Jobs, you ask? Because five years ago Jobs penned a long missive about Flash in which he explained why Adobe’s online video rendering technology had no place on iOS devices such as the iPhone and iPad. It wasn’t just one thing about Flash that Jobs didn’t like — it was everything. He found that Flash was far too power hungry for mobile devices, it didn’t deliver reliable performance and was prone to crashes, and it also had ‘one of the worst security records’ around back in 2010.”
  • Will YouTube HTML5 transition mean the end of Flash security issues? (TechTarget | Sharon Shea)  “Adobe released a further statement on the issue, touching on Flash’s significance and the company’s stance behind HTML5. ‘Flash is an important technology for media and content companies worldwide, with over 1.5 billion downloads and updates for the Flash Player every month,’ Adobe said. ‘At the same time, Adobe is a pioneer in the delivery of HTML5 development tools and a positive contributor to the HTML standard. Flash and HTML will continue to coexist and Adobe is committed to support and advancing both technologies.’”
  • Time to die: Let’s resolve to get rid of Flash already (ReadWrite | Yael Grauer)  “If you need Flash for work, or are addicted to DailyMotion, or can’t deal with Facebook and Amazon refreshing pages too slowly, another option is to use an extension like FlashBlock. This allows you to limit your Flash usage to the sites you select. While you’ll still be somewhat vulnerable if a popular site is infected with malicious advertising, it’ll lower your risk.”
