New Web Trend Sparks David vs. Goliath 2.0

Web giants mobilize to stop developers from 'scraping' their content
By Wesley Oliver, Newser Staff
Posted Dec 31, 2007 4:14 PM CST
Craigslist ordered one software developer to shut down his website, which aggregated information from the online community site onto a single page. "They didn't even talk to me about trying to work something out," Ryan Sit says. "They just banned me." (Getty Images)

Some call it “scraping,” others call it “importing.” Either way, it’s a controversial process pitting independent software developers against the titans of the cyber world: Techies compile, or scrape, loads of data from search engines and social networking sites and pool the data on their own websites, Wired reports. Some companies, relishing the increased traffic, love the service.

But others, charging copyright infringement, are cutting the web ties that allow developers access to their data and have begun developing ways to control how their content is distributed. Many companies, such as Google, use interfaces that formally regulate how much data developers can use. One insider warns that large firms will begin competing with small developers, sparking "an unfair fight."
