OPLIN 4cast #430: A different approach to stopping piracy

March 25th, 2015

Any library that has a sizable number of patrons using the library’s public Internet computers has probably seen more than a few copyright infringement notices. (They’re often passed along to the library through OPLIN; we get them because we’re on file as the “owner” of most Ohio public library IP addresses.) It’s inevitable, given enough people, that someone in the library is going to download something that is protected by copyright, and the companies paid by the movie and music industries to watch for illegal downloads will then issue an infringement notice. Now the entertainment industries are suggesting a different technique: pressuring the Internet Corporation for Assigned Names and Numbers (ICANN) to unregister websites that the industries believe are distributing pirated materials.

  • MPAA, RIAA urge ICANN to do more about copyright infringement (Techspot | Shawn Knight)  “Back in 2013 when ICANN opened up top-level domains, it included a provision in its contract with the registries of the new top-level domains called Public Interest Commitments. Part of that agreement mandated that they’d only do business with domain name registrars that prohibited its customers from distributing piracy, trademark and copyright infringement in addition to a host of other nefarious activities. With that, ICANN suddenly finds itself in a sticky situation. If they are strong-armed into policing such sites, how do they go about doing it?”
  • Hollywood asks domain registrars to censor the Web for intellectual property infringement (Electronic Frontier Foundation | Jeremy Malcolm and Maira Sutton)  “The report goes on to claim that registrars are required, under agreements with ICANN, to take action by locking or suspending domains when they receive a notice about one of their domains facilitating illegal activity. This isn’t true, and by claiming it is, USTR [United States Trade Representative] is here repeating the United States entertainment industry’s current talking points. What is true is that the RIAA and MPAA have recently asked ICANN, the multi-stakeholder body that administers the global domain name system, to establish that domain registrars must take down websites over copyright infringement complaints.”
  • RIAA escalates piracy witchhunt, points its pitchfork at domain registrars (Digital Trends | Chris Leo Palermino)  “The US government…agrees with the RIAA’s stance that domain registrars should be responsible for court-ordered takedown notices. The report specifically mentions one domain registrar, Canada-based Tucows, as one that hasn’t taken action when notified of its clients’ infringing activity. Furthermore, ‘some registrars even advertise to the online community that they will not take action against illicit activity,’ according to the report. Even if domain registrars did follow court orders, the RIAA wants more: they want the domain registrars to take sites down when someone reports copyright infringing content — not just when the court tells them to.”
  • ICANN, copyright infringement, and “the public interest” (The Washington Post | David Post)  “In a sense, it looks harmless enough; if you want to register frabulous.app, or washingtonpost.blog, or dewey-cheatem-and-howe.attorney, or any other 2d-level domain in these TLDs, what’s wrong with making you promise not to engage in ‘piracy’ or ‘fraud,’ or any activity ‘contrary to applicable law’? Who wouldn’t promise such a thing? It is, however, anything but harmless – and the RIAA/MPAA letters show why. Registries and registrars will henceforth have to satisfy ICANN that they are taking appropriate steps to suspend end-users who engage in ‘piracy’ or any activity ‘contrary to applicable law’; if they do not, they risk losing their place in the DNS.”
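What “losing their place in the DNS” amounts to can be sketched with a toy resolver: a registry’s zone is, at bottom, a mapping from names to records, and suspending a domain deletes the entry so the name stops resolving for everyone at once. (This is a conceptual model only; the domain and address below are made up, and the real DNS is a delegation hierarchy of root, TLD registry, registrar, and name servers.)

```python
# Toy model of domain suspension at the registry level.
registry_zone = {"example-pirate-site.app": "203.0.113.10"}  # hypothetical domain

def resolve(name):
    """Return the address for a name, or None (the equivalent of NXDOMAIN)."""
    return registry_zone.get(name)

print(resolve("example-pirate-site.app"))     # 203.0.113.10
registry_zone.pop("example-pirate-site.app")  # the registry suspends the domain
print(resolve("example-pirate-site.app"))     # None: every browser now finds nothing
```

No takedown of the actual server is needed; removing the name from the zone is enough to make the site effectively vanish, which is why the registries and registrars are the pressure point here.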

OPLIN 4cast #429: The tech blogging business

March 18th, 2015

We read a lot of technology blogs every week, watching for things that might interest readers of this 4cast blog. Recently, a couple of the technology blogs themselves have become news, as ReadWrite was sold last month, and GigaOm announced last week that it was closing its business. Is this a trend? Possibly. The fact is, many mainstream news publishers now include a technology blog on their own websites, such as The New York Times’ Bits, the Wall Street Journal’s Digits, the Huffington Post’s HuffPost Tech, and any number of blogs associated with major local daily newspapers. It could be that we’re seeing the beginning of the end of the independent technology blog.

  • Gigaom shuts down in crowded tech media landscape (Wall Street Journal | Steven Perlberg)  “Gigaom’s fate had others in the digital media industry wondering if the event portended a larger ‘content shakeout,’ especially given that Say Media recently sold its own tech site ReadWrite. Executives in the media and venture capital worlds say it is quite difficult to be a self-sustaining site these days, considering the crowded tech media space. ‘When everybody from The Wall Street Journal to the New York Times to everybody else has doubled down on covering tech, what’s your value?’ said Rafat Ali, the CEO and founder of travel site Skift.”
  • Tech blog GigaOm abruptly shuts down (The New York Times | Ravi Somaiya)  “The site, long known for both its business and consumer-facing technology posts, had been open to experimentation in its business model. Like other media start-ups, Gigaom hosted a series of technology conferences that charged high prices for admission. The company offered a white-paper research business, and also sold advertising.”
  • Should we crowdfund ReadWrite? (ReadWrite | Owen Thomas)  “Our former publisher, Say Media, was prepared to shut us down if we hadn’t found a new owner. Wearable World is a good home for ReadWrite, and I’m energized by the conversations I have with our CEO, Redg Snodgrass, about our shared mission to democratize technology. Gigaom’s closure is a stark reminder, though, that the conventional advertising model, even when complemented by businesses like events and research, isn’t always enough to allow independent reporting about technology to thrive.”
  • Platishers, beware: Say Media gives up on publishing (Digiday | Lucia Moses)  “The latest pivot adds to the company’s already complicated story. It also serves as a warning to other digital natives like Vox Media, Business Insider and pretty much anyone with a custom CMS and a pile of VC cash that are trying to figure out how to be the media company of the future by marrying media and tech. It sounds good to say you’ll do both — and that’s probably what venture backers want to hear — but when it comes to execution, the demands of both businesses make a combination very difficult.”

OPLIN 4cast #428: Digital vs. print textbooks

March 11th, 2015

There are conflicting signals coming out of colleges lately in regard to digital textbooks. Are they the wave of the future, or are print textbooks better? There’s a considerable amount of money at stake here (I think we all know how crazy-expensive college textbooks can be), and textbook companies by and large seem to be placing their bets on digital and getting out of the print business. Students, on the other hand, don’t seem to be convinced, with a couple of studies indicating that they prefer print.

  • Should college textbooks go digital? (TechCocktail | Scott Huntington)  “In this digital age, we carry around much of the world’s collective knowledge and history in our pockets. So why are college students asked to pay thousands of dollars for heavy doorstops that are technologically on par with something out of the Bronze Age? This is especially true in fields like science and economics where the information is changing so often that books can be out-of-date before they even hit the shelves.”
  • The death of textbooks? (The Atlantic | Terrance F. Ross)  “Nostalgia aside, it may come as a relief to many, then, that textbooks are becoming anachronistic. Digital in-class learning materials, like software that adapts to the ways in which individual students acquire information, and other forms of virtual education content are becoming more effective and intelligent. College-affordability advocates and others hope this growth could result in the normalization of less costly or even free materials down the road.”
  • Why Chegg is abandoning a business worth over $200 million a year (Fast Company | Ainsley O’Connell)  “Over the next 18 months Chegg will liquidate its print inventory and refocus on its digital products, including self-guided homework help and on-demand tutoring. Students will continue to rent through Chegg’s platform, with the company taking a 20% take on the print textbooks and relying on Ingram to manage operations. The new strategy will widen the scope of Chegg’s digital operations in order to better serve student needs at a time when other companies in the higher education and professional training markets are looking to do the same.”
  • Students reject digital textbooks (Shinyshiny | Diane Shipley)  “Given how heavy textbooks can be, you’d think ebooks would be a huge bonus of being at university now, as opposed to ten years ago. And not only do they weigh less, you can buy or borrow them instantly instead of having to schlepp to the university bookshop or library, search within them easily, and highlight and make notes without being accused of defacing anything. So you’d think students would be all over digital textbooks. But they’re not.”

OPLIN 4cast #427: TV white spaces

March 4th, 2015

In June 2009, television stations in the United States stopped broadcasting analog signals and switched to digital transmissions. The switch freed up large swaths of broadcast spectrum between 50 MHz and 700 MHz that are not needed for digital TV and are available for other uses. One possible use of this “white space” is wireless broadband Internet access, using relatively inexpensive equipment to transmit Internet data over these frequencies rather than over a physical connection or cellular wireless. So does this technology have any value for libraries? Possibly. The Gigabit Libraries Network is currently leading a WhiteSpace Pilot project to demonstrate how TV white space “…can increase availability and convenience of Wi-Fi access at tens of thousands of new fixed and portable public library community hotspots.”

  • Microsoft-backed TV white spaces trial goes commercial in Ghana (ZDNet | Adam Oxford)  “TV white spaces, otherwise known as dynamic spectrum allocation, is seen as a promising form of connectivity for extending broadband networks to rural areas across the world – including parts of the US. It works on unlicensed areas of the radio frequency spectrum that are allocated for analogue TV channels, using gaps in the signal to carry internet traffic. Google, Facebook, and Microsoft have all run white spaces pilots in Africa, and it is considered a promising alternative for broadband access where building a commercial case for 4G or fibre is tough.”
  • White Space, the next internet disruption: 10 things to know (TechRepublic | Lyndsey Gilpin)  “Television networks leave gaps between channels for buffering purposes, and this space in the wireless spectrum is similar to what is used for 4G and so it can be used to deliver widespread broadband internet. Typical home Wi-Fi can travel through two walls. White Space broadband can travel up to 10 kilometers, through vegetation, buildings, and other obstacles. Tablets, phones, and computers can all access this wireless internet using White Space through fixed or portable power stations.”
  • TV white space will connect the internet of things (Wired UK | James Temperton)  “Uses for the technology currently being trialled include live video streaming of meerkats at London Zoo and sensor networks to provide flood warnings on the Thames and Cherwell rivers near Oxford. Trials have also been carried out to bring faster broadband connections to ships travelling near the Orkney Islands. The first commercial uses of the technology are expected by the end of 2015.”
  • Libraries to expand as TVWS hot-spots with new Knight project (CivSource | Bailey McCann)  “Phase two of the project – with the aid of Knight funding – will expand the role of libraries using TVWS. Participants will be encouraged to think of ways to use TVWS/WiFi for community disaster planning as a redundant and potentially community resource. Ideas to explore include how to use libraries as a headquarters during disasters or as pop-up hotspots around the community.”
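The range advantage described above has a simple physical basis: free-space path loss grows with frequency, so a signal in the sub-700 MHz TV band loses substantially less strength than 2.4 GHz Wi-Fi over the same distance. A quick back-of-the-envelope check (an idealized free-space model only, ignoring the walls and vegetation that make real white-space performance even more favorable):

```python
import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
    c = 3.0e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Loss over 1 km at a TV white-space frequency vs. ordinary Wi-Fi:
print(round(fspl_db(1000, 600e6), 1))  # ~88.0 dB at 600 MHz
print(round(fspl_db(1000, 2.4e9), 1))  # ~100.0 dB at 2.4 GHz
```

That 12 dB gap means the 2.4 GHz signal arrives with roughly one-sixteenth the power of the 600 MHz signal at the same distance, which is a large part of why white-space links can stretch to kilometers while home Wi-Fi struggles past two walls.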

OPLIN 4cast #426: The next version of HTTP

February 25th, 2015

Last week, the Internet Engineering Steering Group (IESG) posted a message stating that they had “…approved the following document: ‘Hypertext Transfer Protocol version 2’…as Proposed Standard.” While this sounds pretty innocuous and/or cryptic to many of us, the announcement actually marked the official beginning of HTTP/2, the long-awaited successor to HTTP/1.1, the information transfer protocol currently used by the World Wide Web (allowing the hypertext linking between web pages mentioned in last week’s 4cast). While that still may not mean much to most of us, we should all notice the Web responding a bit faster once HTTP/2 becomes the common standard.

  • The internet is about to get faster — here’s why (Business Insider | Peter Maynard)  “When a web page is requested, the server sends back the page, but must wait for the web browser to parse the page’s HTML and issue further requests for things it finds in the code, such as images. Server push allows the server to send all the resources associated with a page when the page is requested, without waiting. This will cut a lot of the latency associated with web connections. Once web servers and web browsers start implementing HTTP/2 – which could be as soon as a few weeks from now – the web-browsing experience will feel quicker and more responsive.”
  • Everything you need to know about HTTP2 (ReadWrite | Lauren Orsini)  “For the past 16 years, HTTP has basically done the heavy lifting of bringing Web pages to your browser. When you type a URL into your browser bar—readwrite.com, for instance—you’re actually creating an HTTP request to the Web server that instructs it to find and deliver a particular Web page. But HTTP has its limits. Modern Web pages pack in more features than just about anyone imagined back in 1999, making it more resource-intensive than ever just to load them in a browser.”
  • HTTP/2 finished, coming to browsers within weeks (Ars Technica | Peter Bright)  “In HTTP/2, multiple bidirectional streams are multiplexed over a single TCP connection. Each stream can carry a request/response pair, and multiple requests to a server can be made by using multiple streams. However, the streams are all independent; if one stream is slow, the HTTP/2 connection can still be used to transfer data belonging to other streams. Similarly, a client can request a large object and then a small object, and the response to the small object can be given before, or even during, the response to the large object.”
  • Don’t blame yourself for ignoring HTTP/2, the biggest HTTP update in years (VentureBeat | Cullen Macdonald)  “HTTP/2 is for sure going to add to the increase in rate of change for things on the Internet, but it’ll do it without being noticed. It will continue to be incorporated into more websites you visit, and browsers will more fully support the official spec. There won’t ever be an explosion of speed from your phone’s browser where you’ll ask yourself ‘oh! is today HTTP/2 day?!’”
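The multiplexing behavior the Ars Technica piece describes can be illustrated with a toy simulation (this models stream scheduling only, not the actual HTTP/2 wire protocol): on an HTTP/1.1-style connection, responses come back in request order, while independent HTTP/2 streams let a small response overtake a large one requested earlier.

```python
import asyncio

async def fetch(name, size):
    # Simulate a response whose transfer time is proportional to its size.
    await asyncio.sleep(size * 0.01)
    return name

async def http1_style(requests):
    # One HTTP/1.1 connection: each response must finish before the next starts.
    return [await fetch(name, size) for name, size in requests]

async def http2_style(requests):
    # HTTP/2: all streams run concurrently over one connection; results
    # arrive in completion order, so a slow stream doesn't block the rest.
    tasks = [asyncio.create_task(fetch(name, size)) for name, size in requests]
    return [await t for t in asyncio.as_completed(tasks)]

requests = [("large-image", 10), ("small-script", 1)]
print(asyncio.run(http1_style(requests)))  # ['large-image', 'small-script']
print(asyncio.run(http2_style(requests)))  # ['small-script', 'large-image']
```

This head-of-line-blocking fix, together with header compression and server push, is where the promised speedup comes from, and none of it requires any change to the pages themselves.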

OPLIN 4cast #425: Deep links

February 18th, 2015

One of the fundamental building blocks of the World Wide Web has always been the hyperlink, the code that allows you to jump from web page to web page and back again. These links function because each web page has an address that makes it possible to find the page when wanted, and also makes it possible for search engines to take you to a specific page of information upon request. Mobile apps do not share this basic functionality of the Web, and now that mobile apps occupy a larger and larger portion of the time people spend on the Internet, there is more and more pressure to recreate the Web experience within the mobile experience. “Deep linking” between apps may be one answer.

  • Apps everywhere, but no unifying link (NY Times Technology | Conor Dougherty)  “The app problem traces its origins to 2008, when Apple introduced the App Store for iPhones. Unlike websites, apps were set up to be separate little boxes whose technology prohibited them from interacting with one another. At the time, the idea of a phone full of apps was new enough that most people were not very worried about whether those apps could link to each other. But today, apps have begun to eclipse the web.”
  • Making apps as easy to search as the Web (Re/code | Ina Fried)  “To further complicate matters, it’s not only about the method for linking, it’s also about figuring out the right business opportunity. To search engines like Google that profess to catalog the world’s information, apps have been a threat — silos that need to be opened.”
  • Deep links, extensions turning apps into new mobile Web (MediaPost | Jeremy Shabtai)  “The general idea behind app deep linking is similar to hyperlinks found on the Web. An app is able to link directly to specific content buried within another app, allowing users to jump back-and-forth from one app to another. For example, Google Maps now has a link that takes you directly into the Uber app to call a car. Hitting the ‘back’ button actually navigates you back to the app page you just left. Mobile apps are finally starting to act like the Internet.”
  • Why deep linking matters (Taylor Davidson)  “Without the URL structure of the web, mobile app developers are forced to implement deep linking schemes to build back some of the basic functionality that the open web was originally built upon. Looking further, will deep linking recreate a complete URL structure for apps, or will it merely provide a short-term solution for mobile transactional ads and ecommerce? Even if the answer is only ‘merely’, it should unlock significant ecommerce mobile advertising spending and help lead to a much more diverse set of mobile advertising ad units and experiences.”
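Mechanically, a deep link is just a URL with an app-specific scheme in place of http://; the operating system hands it to whichever installed app registered that scheme. A minimal sketch using Python’s standard URL tools (the uber:// parameters here are illustrative, not the app’s documented API):

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_deep_link(scheme, params):
    # Same anatomy as a web URL: a scheme, then query parameters
    # that the target app knows how to interpret.
    return f"{scheme}://?{urlencode(params)}"

link = build_deep_link("uber", {"action": "setPickup", "pickup": "my_location"})
print(link)  # uber://?action=setPickup&pickup=my_location

# The receiving app parses it much as a web server parses a request:
parts = urlsplit(link)
print(parts.scheme)           # uber
print(parse_qs(parts.query))  # {'action': ['setPickup'], 'pickup': ['my_location']}
```

Because the link reuses ordinary URL syntax, existing web infrastructure (search indexes, ad click-throughs, shared links) can carry it; the missing piece the articles describe is agreement on what those schemes and parameters should be across apps.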

Articles from Ohio Web Library:

  1. Mobile deep linking: Should you care? (Folio: The Magazine for Magazine Management, March 2014, p19 | Much Speers)
  2. Facebook new ‘App Links’ mean no more switching from apps to Web browsers. (International Business Times, 5/3/2014)
  3. Branch announces funding for next generation deep linking technology. (PR Newswire US, 9/23/2014)

OPLIN 4cast #424: Trashing Flash

February 11th, 2015

Once upon a time, Adobe Flash was a feature of many websites and Internet ads that used animated graphics to catch the attention of site users. And quite a few Ohio public library websites still require Flash Player to view or listen to some of their site content. Now, however, a lot of websites are shying away from Flash, for a variety of reasons that have been summarized on the Occupy Flash website. In the past few weeks, there have been some serious security issues with Flash, and YouTube announced a move away from Flash technology to HTML5 for delivering Internet video, adding impetus to the calls for the demise of Flash. Here’s a sampling of some recent stories.

  • As Flash 0day exploits reach new level of meanness, what are users to do? (Ars Technica | Dan Goodin)  “The breakneck pace of the exploits is creating fatigue among end users, and one presumes, among engineers inside Adobe. No sooner is one patch rolled out than an exploit targeting a new vulnerability becomes available. What’s more, research from Cisco Systems found the recent Flash exploits were being served on more than 1,800 domains.”
  • Steve Jobs gets vindicated one last time (BGR | Brad Reed)  “Why is this a vindication for Steve Jobs, you ask? Because five years ago Jobs penned a long missive about Flash in which he explained why Adobe’s online video rendering technology had no place on iOS devices such as the iPhone and iPad. It wasn’t just one thing about Flash that Jobs didn’t like — it was everything. He found that Flash was far too power hungry for mobile devices, it didn’t deliver reliable performance and was prone to crashes, and it also had ‘one of the worst security records’ around back in 2010.”
  • Will YouTube HTML5 transition mean the end of Flash security issues? (TechTarget | Sharon Shea)  “Adobe released a further statement on the issue, touching on Flash’s significance and the company’s stance behind HTML5. ‘Flash is an important technology for media and content companies worldwide, with over 1.5 billion downloads and updates for the Flash Player every month,’ Adobe said. ‘At the same time, Adobe is a pioneer in the delivery of HTML5 development tools and a positive contributor to the HTML standard. Flash and HTML will continue to coexist and Adobe is committed to support and advancing both technologies.’”
  • Time to die: Let’s resolve to get rid of Flash already (ReadWrite | Yael Grauer)  “If you need Flash for work, or are addicted to DailyMotion, or can’t deal with Facebook and Amazon refreshing pages too slowly, another option is to use an extension like FlashBlock. This allows you to limit your Flash usage to the sites you select. While you’ll still be somewhat vulnerable if a popular site is infected with malicious advertising, it’ll lower your risk.”

OPLIN 4cast #423: Power to the (wireless) people

February 4th, 2015

So you have public Wi-Fi in your library, and your users are thankful. But how about power for recharging their wireless devices? Do they have to carry a cord and sit by the wall so they can plug in? Restaurants and coffee shops, the other popular places for public Wi-Fi, are increasingly providing wireless recharging stations at their tables, so users can simply lay their smartphones on a charging pad and load up on power. But nothing new is ever simple, and there are competing technologies for wireless charging at the moment. Recent news may make the choice of charging equipment a little easier for everyone.

  • Alliance for Wireless Power merges with Power Matters Alliance to push wireless charging standard (FierceWireless | Mike Dano)  “The A4WP was founded in 2012 to push the Rezence-branded technology for magnetic resonance wireless charging, and it counts Broadcom, Intel, Qualcomm, Samsung and others as members. The PMA was founded in 2012 to push its inductive charging technology, and its backers include AT&T, Duracell, Powermat Technologies and Starbucks, which has added PMA-capable chargers into some of its coffee stores. The groups said they would now jointly push wireless charging due to the merger, but that both technologies would continue to be available so that members could use whichever made the most sense.”
  • Alliance for Wireless Power and Power Matters Alliance agree to merge (Chip Design | press release)  “Consumers will gain access to an exciting and enhanced battery charging and power management experience sooner across the full spectrum of devices in daily use. Mobile network operators and commercial and retail brands can commit to the necessary investment confident of stable, long-term evolution and management of innovative wireless charging technologies.”
  • Two rival groups pushing wireless charging declare peace (Wall Street Journal Digits | Don Clark)  “The two groups had already made signals that they were moving closer together. Meanwhile, the third group—called the Wireless Power Consortium—claims 200 members including Philips and Microsoft. It supports a standard called Qi (pronounced ‘chee’), which it says combines elements of both inductive and resonance technology. Most Windows Phone handsets, and some Android smartphones are currently Qi compatible.”
  • Key wireless charging groups A4WP, PMA agree to merge (CNET | Roger Cheng)  “The two groups believe the merger will close by the middle of 2015, and plan to chose a new name for the combined group. The WPC, meanwhile, said the merger wouldn’t have an effect on its efforts to bring wireless charging capabilities to the consumer. ‘The two groups are filling gaps with technology the other didn’t have, and they have been behind in rolling out commercial products,’ said John Perzow, vice president of market development for the WPC.”
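Both camps’ technologies, inductive and magnetic resonance alike, transfer power between coils tuned as LC resonators, with transmitter and receiver tuned to the same frequency so power couples efficiently. A quick sketch of the underlying math, using illustrative component values (not taken from any particular product) that land in the low-hundreds-of-kHz range typical of inductive chargers:

```python
import math

def resonant_freq_hz(inductance_h, capacitance_f):
    # Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values: a 24 uH charging coil with a 100 nF tuning capacitor
f = resonant_freq_hz(24e-6, 100e-9)
print(round(f / 1000, 1))  # ~102.7 kHz
```

The standards fight is partly about frequency and coupling choices like this one: tightly coupled inductive pads need near-contact alignment, while resonance designs trade some efficiency for spatial freedom, which is why the merged group plans to keep both technologies available.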

OPLIN 4cast #422: Hollywood pirates

January 28th, 2015

It’s a little less than four weeks until the Oscar awards are handed out, so that means it’s time for…piracy! Posting pirated movies on the Internet, that is. Every year, the “screener” DVDs of nominated movies that are sent to Oscar voters find their way onto the web for illegal downloading. Screeners often are versions of the movie that have not yet been through the final processing to enhance the imaging, but are good enough for voters to rate the movie. But while movie piracy in general is on the increase, if takedown notices are any indication, a strange thing may be happening with screeners: Some of them may not be good enough for the discriminating pirate any more.

  • 95% of Oscar contenders leaked on pirate sites already (TorrentFreak | Ernesto Van Der Sar)  “What stands out immediately is how widely available the films are. Of all 2015 nominees, except documentary and foreign films, 34 of the 36 films (95%) are present on pirate sites. Only the animated feature film ‘Song of The Sea’ and best original song nominee ‘Glen Campbell: I’ll Be Me’ have yet to appear online. The films that are available don’t all come in perfect quality of course. ‘Beyond the Lights,’ for example, only leaked in a CAM (camcorded) version. Most, however, are available in relatively decent screener, DVDRip or comparable quality.”
  • The cat-and-mouse game between online pirates and the Academy Awards (Allvoices | Joe Kukura)  “The vulnerable spot in the Academy ecosystem does appear to be the academy’s own voters. The voters, many of whom employ a large personal staff because they are rich and famous Hollywood people, are either complacent, unaware or in on the job. The common perceptions is that stars’ personal staff are making the copies, though this has never been verified. But it’s not as if the Academy Awards voters’ homes are being broken into and their screener DVDs stolen. Voters’ DVDs are being copied on purpose by non-intruders.”
  • High number of Oscar screeners hit pirate sites (Consumerist | Chris Morran)  “It’s not known which roommate of which member of which guild stole his friend’s DVDs and shared them with the world, but TorrentFreak reports that nearly all of the above screeners (with the exception of The Hobbit and Big Hero 6) originated from the same source. Lending credence to the notion that this is some amateur who decided to share this content with the world, TorrentFreak says that the encoding of the files being shared via BitTorrent is ‘choppy,’ implying that it’s an inexperienced pirate who ripped these files.”
  • Pirating the 2015 Oscars: HD edition (The Message | Andy Baio)  “If you’re the first to release a highly-prized film in a high-quality release, you win bragging rights over every other group. A release that’s lower quality than one already leaked by someone else? Completely worthless. A cam isn’t great, but a telesync is better. A telecine is marginally better than a telesync, but a watermarked screener? Much, much better. But here’s the thing: screeners are stuck in the last decade. While we’re all streaming HD movies from iTunes or Netflix, the movie studios almost universally send screeners by mail on DVDs, which is forever stuck in low-resolution standard-definition quality.”

OPLIN 4cast #421: Inside the Dark Side

January 21st, 2015

Three months ago we posted a 4cast about the availability of cheap Distributed Denial of Service (DDoS) attacks on the Internet, mentioning as well that OPLIN has a system in place to protect libraries from such attacks. Over the Christmas holidays, DDoS attacks on the Xbox Live and PlayStation networks got a lot of attention in the media, and shortly after that the “Lizard Squad” group that claimed credit for those attacks announced the availability of their own inexpensive DDoS service for hire. Now, thanks in large part to security researcher Brian Krebs, that service is falling apart and providing an interesting glimpse into the dark side of the Internet.

  • Lizard kids: A long trail of fail (Krebs on Security | Brian Krebs)  “The Lizard kids only ceased their attack against Sony’s Playstation and Microsoft’s Xbox Live networks last week after MegaUpload founder Kim Dotcom offered the group $300,000 worth of vouchers for his service in exchange for ending the assault. And in a development probably that shocks no one, the gang’s members cynically told Dailydot that both attacks were just elaborate commercials for and a run-up to this DDoS-for-hire offering. The group is advertising the new ‘booter service’ via its Twitter account, which has some 132,000+ followers. Subscriptions range from $5.99 per month for the ability to knock a target offline for 100 seconds at a time, to $129.99 monthly for DDoS attacks lasting more than eight hours.”
  • A hacked DDoS-on-demand site offers a look into mind of “booter” users (Ars Technica | Sean Gallagher)  “Things have not gone all that well for LizardSquad since the launch of LizardStresser. Shortly after the service—which uses a botnet of hacked home and institutional routers—was launched, members of LizardSquad started getting arrested. Last week the LizardStresser server was hacked, and its database was dumped and posted to Mega by the former operator of the darknet ‘doxing’ site Doxbin. As a result, the usernames and passwords of LizardSquad’s ‘customers,’ along with logs of the Internet addresses that had been attacked by the router botnet, were laid bare for everyone to see.”
  • Xbox Live destroyers Lizard Squad facing backlash in underground hacker wars (Forbes | Thomas Fox-Brewster)  “Investigative journalist Brian Krebs broke the news that the Lizard Stresser Distributed Denial of Service (DDoS) offering, which lets people pay for website takedowns and which the Christmas attacks were supposed to advertise, was breached and the customer database leaked. Forbes has obtained a copy of what appears to be a leaked Lizard Stresser database, though it differs from the one Krebs posted a screenshot of (incredibly, Lizard Squad has been making DMCA requests for links to the leaks to be taken down from Kim Dotcom’s Mega storage service). The link came courtesy of one of the more talkative dark web denizens who goes by the name of ‘nachash’, who once ran the controversial Doxbin site, where personal details of select individuals were posted on the anonymising Tor network.”
  • Lizard Squad’s LizardStresser hacked and customer details made public (The Guardian | Stuart Dredge)  “The news follows several arrests made as police investigate the original PlayStation Network and Xbox Live attacks. On 31 December, a 22 year-old man from Twickenham was arrested by the South East Regional Organised Crime Unit (SEROCU) on suspicion of fraud by false representation and Computer Misuse Act offences, before being released on bail until 10 March. Then, on 16 January, an 18 year-old man was arrested in Southport on suspicion of unauthorised access to computer material, unauthorised access with intent to commit further offences, and threats to kill.”
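On the defense side, systems like the one OPLIN uses combine many techniques, but one basic building block is per-source rate limiting. A minimal token-bucket sketch (conceptual only; production DDoS mitigation is far more involved, and a real booter’s traffic comes from many sources at once):

```python
import time

class TokenBucket:
    """Allow short bursts but cap the sustained request rate from one source."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # drop: this source has exhausted its budget

bucket = TokenBucket(rate=5, capacity=10)
flood = [bucket.allow() for _ in range(100)]
print(flood.count(True))  # roughly 10: the initial burst passes, the flood is dropped
```

A legitimate patron browsing normally never exhausts the bucket, while a flood from one address is throttled to the refill rate; distributed attacks defeat this simple version, which is why commercial mitigation also uses traffic scrubbing and upstream filtering.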
