Tag: Data

After The Facebook Scandal It’s Time To Base The Digital Economy On Private Ownership Of Data

The continuing collapse of public trust in Facebook is welcome news to those of us who have been warning about the perils of “data extractivism” for years.

It’s reassuring to have final, definitive proof that beneath Facebook’s highfalutin rhetoric of “building a global community that works for all of us” lies a cynical, aggressive project – of building a global data vacuum cleaner that sucks from all of us.

Like others in this industry, Facebook makes money by drilling deep into our data selves – pokes and likes are simply how our data comes to the surface – much like energy firms drill deep into oil wells: profits first, social and individual consequences later.

Furthermore, the rosy digital future – where cleverly customized ads subsidize the provision of what even Mark Zuckerberg calls “social infrastructure” – is no longer something that many of us can take for granted.

While the monetary costs of building and operating this “social infrastructure” might be zero – for taxpayers anyway – its social and political costs are, perhaps, even harder to account for than the costs of cheap petroleum in the 1970s.

Such realizations, as sudden and shocking as they might be, are not enough. Facebook is a symptom, not a cause of our problems.

In the long run, blaming its corporate culture is likely to prove as futile as blaming ourselves.

Thus, instead of debating whether to send Zuckerberg into the corporate equivalent of exile, we should do our best to understand how to reorganize the digital economy to benefit citizens.

And not just a handful of multi-billion-dollar firms that view their users as passive consumers with no political or economic ideas or aspirations of their own.

The obstacles standing in the way of this transformative agenda are many and, worse, they are structural – not likely to be solved with a clever app.

These obstacles stem primarily from the disquieting dynamics of contemporary capitalism – which is more stagnant than our obsession with innovation and disruption suggests – rather than from our supposed addiction to social networking or tech companies’ abuse of that addiction.

Please like, share and tweet this article.

Pass it on: New Scientist

Watch Zuckerberg Testify Before Congress

Senator Leahy brought out a board and asked Zuckerberg whether it specifically showed groups run by Russian operatives.

After yesterday’s seemingly endless marathon hearing before the Senate Judiciary Committee and the Senate Commerce, Science, and Transportation Committee, today Mark Zuckerberg heads to the House, where he’ll be answering questions in front of the Energy and Commerce Committee.

Yesterday’s session featured almost 50 legislators peppering Zuckerberg with queries about how Facebook safeguards user data, details of the Cambridge Analytica scandal, and even what kind of regulations Zuckerberg believes should be put in place for Facebook.

The day ended with a number of revelations: Zuckerberg said that Facebook “doesn’t feel” like a monopoly to him, but he had a hard time naming a viable competitor.

He hinted that there may one day be a paid version of the platform; over and over, he promised senators that data privacy was a legitimate concern, and a priority for the platform.

Zuckerberg said he supports legislation to rein in Facebook’s data collection powers, but he had a hard time committing to any specific new laws that would do that.

One exhibit showed how the This Is Your Digital Life app conflicted with Facebook’s terms of service at the time.

And he once again tried to tamp down suspicion that Facebook listened to our conversations through our phones.

We also learned that a lot of the legislators responsible for regulating Facebook don’t fully understand what Facebook is — or how it works.

Zuckerberg looked mostly comfortable and confident through the nearly five-hour hearing. Let’s see how he holds up on day two.

Please like, share and tweet this article.

Pass it on: Popular Science

How HTTPS Website Security Is Making the Internet Safer From Hackers

You may have noticed in your travels around the internet that your browser’s address bar occasionally turns green and displays a padlock—that’s HTTPS, or a secure version of the Hypertext Transfer Protocol, swinging into action.

This little green padlock is becoming vitally important as more and more of your online privacy is eroded.

Just because your ISP can now see which sites you browse doesn’t mean it has to know all the content you’re consuming.

Below is a rundown on HTTPS, so you can better understand this first and easiest line of defense against potential snoopers and hackers.

HTTP, or the Hypertext Transfer Protocol, is the universally agreed-upon coding structure that the web is built on.

Hypertext is the basic idea of having plain text with embedded links you can click on; the Transfer Protocol is a standard way of communicating it.

When you see HTTP in your browser you know you’re connecting to a standard, run-of-the-mill website, as opposed to a different kind of connection, like FTP (File Transfer Protocol), which is often used by file storage databases.

The protocol before a web address tells your browser what to expect and how to display the information it finds. So what about the extra S in HTTPS?

The S is simple. It means Secure.

That security was originally provided by Secure Sockets Layer (SSL), which has since been superseded by a broader security protocol called Transport Layer Security (TLS).

TLS is one of the two layers that make up HTTPS, the other being traditional HTTP.

TLS works to verify that the website you’ve loaded up is actually the website you wanted to load up—that the Facebook page you see before you really is Facebook and not a site pretending to be Facebook.
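
To see that identity check happen outside a browser, here's a minimal Python sketch (the hostname is just an example; any HTTPS site works). It opens a TLS connection, lets Python verify the certificate chain and hostname the way a browser would, and prints who the certificate was issued to and by.

```python
import socket
import ssl

# A minimal sketch of the identity check TLS performs. The hostname is
# just an example. create_default_context() makes Python verify the
# certificate chain and hostname, raising an error if either fails.
hostname = "www.facebook.com"
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Negotiated:", tls.version())  # e.g. TLSv1.3
        print("Issued to:", dict(pair[0] for pair in cert["subject"]))
        print("Issued by:", dict(pair[0] for pair in cert["issuer"]))
```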

On top of that, TLS encrypts all of the data you’re transmitting (like apps such as Signal or WhatsApp do).

Anyone who happens across the traffic coming to or from your computer when it’s connected to an HTTPS site can’t make sense of it—they can’t read it or alter its contents.

So if someone wants to catch the username and password you just sent to Google, or wants to throw up a webpage that looks like Instagram but isn’t, or wants to jump in on your email conversations and change what’s being said, HTTPS helps to stop them.
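
Mainstream HTTP libraries perform these checks for you by default. A small illustrative Python example (wrong.host.badssl.com is a deliberately misconfigured test host provided by badssl.com for exactly this purpose):

```python
import urllib.error
import urllib.request

# Certificate and hostname are verified by default, so this response
# arrived encrypted from the server we actually intended to reach.
with urllib.request.urlopen("https://example.com") as resp:
    print(resp.status, resp.headers.get("Content-Type"))

# A certificate that doesn't match the hostname is rejected before any
# application data is exchanged.
try:
    urllib.request.urlopen("https://wrong.host.badssl.com/")
except urllib.error.URLError as err:
    print("Refused:", err.reason)  # certificate verify failed
```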

It’s obvious why login details, credit card information, and the like are better encrypted than sent in plain text—encryption makes them much harder to steal.

In 2017, if you come across a shopping or banking site, or any webpage that asks you to log in, it should have HTTPS enabled; if not, take your business elsewhere.

If you’re worried about whether your connection really is secure inside a mobile app, check the details of the app listing and contact the developer directly.

So if HTTPS is so great, why not use it for everything? That’s definitely a plan.

There is now a big push to get HTTPS used as standard, but because it previously required extra processing power and bandwidth, it hasn’t always made sense for pages where you’re not entering or accessing any sensitive information.

The latest HTTPS iterations remove most of these drawbacks, so we should see it deployed more widely in the future—although converting old, large sites can take a lot of time.

If you want to stay as secure as possible, the HTTPS Everywhere extension for Chrome and Firefox makes sure you’re always connected to the HTTPS version of a site, where one has been made available, and fixes a few security bugs in the HTTPS approach at the same time.

It’s well worth installing and using, particularly on public Wi-Fi, where unwelcome eavesdroppers are more likely to be trying to listen in.
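
For a rough feel of what such an extension does behind the scenes, here's a hedged Python sketch. The real extension relies on a curated ruleset rather than this naive probe; the function below is purely illustrative.

```python
import urllib.parse
import urllib.request

def upgrade_to_https(url: str, timeout: float = 5.0) -> str:
    """Rewrite an http:// URL to https:// if the site answers there;
    otherwise fall back to the original URL. (Naive illustration only;
    HTTPS Everywhere ships curated per-site rules instead of probing.)"""
    parts = urllib.parse.urlsplit(url)
    if parts.scheme != "http":
        return url  # already https, or not a web URL at all
    candidate = urllib.parse.urlunsplit(parts._replace(scheme="https"))
    try:
        with urllib.request.urlopen(candidate, timeout=timeout):
            return candidate  # secure version exists; use it
    except OSError:
        return url  # no working HTTPS version; keep the original

print(upgrade_to_https("http://example.com/"))  # -> https://example.com/
```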

HTTPS isn’t 100 percent unbeatable—no security measure is—but it makes it much more difficult for hackers to spy on and manipulate sensitive data as it travels between your computer and the web at large, as well as adding an extra check to verify the identity of the sites you visit.

It’s a vital part of staying safe on the web.

Please like, share and tweet this article.

Pass it on: Popular Science

This City In Alaska Is Warming So Fast, Algorithms Removed The Data Because It Seemed Unreal

Last week, scientists were pulling together the latest data for the National Oceanic and Atmospheric Administration’s monthly report on the climate when they noticed something strange: One of their key climate monitoring stations had fallen off the map.

All of the data for Barrow, Alaska — the northernmost city in the United States — was missing.

No, Barrow hadn’t literally been swallowed by the pounding waves of the Arctic Ocean (although it does sit precipitously close).

The missing station was just the result of rapid, man-made climate change, with a runaway effect on the Arctic.

The temperature in Barrow had been rising so fast this year that the data was automatically flagged as unreal and removed from the climate database.

It was done by algorithms that were put in place to ensure that only the best data gets included in NOAA’s reports.

They’re handy for keeping the data sets clean, but this kind of quality-control algorithm is good only in “average” situations, with no outliers. The situation in Barrow, however, is anything but average.
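
To make the problem concrete, here's a toy Python version of such a check. The threshold and the historical values are invented, and NOAA's real quality control is considerably more sophisticated, but the failure mode is the same: a genuinely new temperature looks like an error against the old baseline.

```python
import statistics

def looks_unreal(history, new_reading, z_max=3.0):
    """Toy quality-control check: flag a reading that sits more than
    z_max standard deviations from the station's historical mean."""
    mean = statistics.mean(history)
    spread = statistics.pstdev(history)
    return abs(new_reading - mean) > z_max * spread

# Invented October means for illustration: a stable cold baseline.
octobers = [-8.2, -8.5, -7.9, -8.8, -8.1, -8.4]
print(looks_unreal(octobers, -8.3))  # False: an ordinary reading passes
print(looks_unreal(octobers, -0.5))  # True: real-but-rapid warming gets dropped
```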

If climate change is a fiery coal-mine disaster, then Barrow is our canary. The Arctic is warming faster than any other place on Earth, and Barrow is in the thick of it.

With less and less sea ice to reflect sunlight, the temperature around the North Pole is speeding upward.

The missing data obviously confused meteorologists and researchers, since it’s a record they’ve been watching closely, according to Deke Arndt, the chief of NOAA’s Climate Monitoring Branch.

He described it as “an ironic exclamation point to swift regional climate change in and near the Arctic.”

Just this week, scientists reported that the Arctic had its second-warmest year — behind 2016 — with the lowest sea ice ever recorded.

The announcement came at the annual meeting of the American Geophysical Union, and the report is topped with an alarming headline: “Arctic shows no sign of returning to reliably frozen region of recent past decades.”

Changes in the Arctic extend beyond sea ice. Vast expanses of former permafrost have been reduced to mud. Nonnative species of plants, types that grow only in warmer climates, are spreading into what used to be the tundra.

Nowhere is this greening of the Arctic happening faster than the North Slope of Alaska, observable with high-resolution clarity on NOAA satellite imagery.

“The current observed rate of sea ice decline and warming temperatures are higher than at any other time in the last 1,500 years, and likely longer than that,” the NOAA report says.

At no place is this more blatantly obvious than in Barrow itself, which recently changed its name to the traditional Alaska Native name Utqiagvik.

In just the 17 years since 2000, the average October temperature in Barrow has climbed 7.8 degrees. The November temperature is up 6.9 degrees.

The December average has warmed 4.7 degrees. No wonder the data was flagged.

The Barrow temperatures are now safely back in the climate-monitoring data sets. Statisticians will have to come up with a new algorithm to prevent legitimate temperatures from being removed in the future.
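
One hedged sketch of what a trend-aware replacement might look like (my own illustration, not NOAA's actual fix): judge each new reading against a baseline extrapolated from the station's recent trend rather than its long-term mean.

```python
def looks_unreal_detrended(history, new_reading, z_max=3.0):
    """Fit a simple linear trend to the history and compare the new
    reading with the extrapolated value, so a steady warming signal
    is no longer mistaken for bad data. (Illustration only.)"""
    n = len(history)
    x_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in enumerate(history))
    slope /= sum((x - x_mean) ** 2 for x in range(n))
    expected = y_mean + slope * (n - x_mean)  # projection for the new point
    residuals = [y - (y_mean + slope * (x - x_mean))
                 for x, y in enumerate(history)]
    spread = (sum(r * r for r in residuals) / n) ** 0.5 or 1.0
    return abs(new_reading - expected) > z_max * spread

# A steadily warming series: the next warm value is expected, not discarded.
octobers = [-8.4, -7.1, -5.9, -4.6, -3.2, -1.9]
print(looks_unreal_detrended(octobers, -0.6))  # False: consistent with trend
```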

New algorithms for a new normal.

Please like, share and tweet this article.

Pass it on: Popular Science

Too Much Big Data May Not Be Enough

In the quest to mine and analyze meaningful, reliable, and useful data from a growing plethora of electronic and online sources, healthcare organizations can let the big picture overshadow the many underlying components that contribute to patient care improvement.

The clinical data and diagnostic images in radiology information systems (RIS) and picture archiving and communication systems (PACS) remain two examples.

For clinical imaging and radiology executives, these visual clues and cues are necessary for effective, efficient decision support.

Certainly a growing number of manufacturers and information technology companies recognize this – even if many healthcare providers have not yet reached the point where they can tackle the necessary underlying infrastructure beyond the planning and strategic stages.

As a result, they’re offering providers a light at the end of the tunnel.

“The latest generation of reporting capabilities can help improve the utilization of imaging data for diagnostic decision making,” says Cristine Kao, Global Marketing Director for Healthcare Information Solutions, Carestream.

An NIH study concluded that oncologists and radiologists prefer quantitative reports that include measurements as well as hyperlinks to annotated images with tumor measurements, for example.

A report by Emory and ACR shows eight out of 10 physicians will send more referrals to facilities that can offer interactive multimedia reporting – citing the ability to better collaborate with radiologists.

Connecting all of the technology and tools remains important, too, for a visually rich information view, according to Todd Winey, Senior Advisor, Strategic Markets, InterSystems.

For the clinical and diagnostic data to play a more valuable role in patient care improvement, these trends need to be accelerated, Winey insists, which isn’t without challenges.

“VNAs [vendor neutral archives] remain only marginally deployed,” he laments. “Many of the advances in radiology information systems and PACS have been focused on productivity improvements for radiologists and are not yet fully supporting advanced interoperability.”

Kao agrees with the foundational importance of a VNA but adds that it shouldn’t stop there.

Depending on an organization’s capabilities, imaging data must be accessible to more than just one clinical segment to be included as part of the decision support process, according to Winey.

Kao says she fully anticipates future reporting functions may include “more intuitive searching capabilities that will link pertinent patient information for a specific condition or disease, even if previous reports did not include the specific word involved in the search command.”
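
As a loose illustration of that idea, here's a hypothetical Python sketch. The synonym table and function are invented; a production system would draw on a clinical terminology service such as SNOMED CT rather than a hand-written dictionary.

```python
# Hypothetical sketch of concept-based report search: expand the query
# into related clinical vocabulary before matching, so a search for
# "heart attack" still finds a report that only says "myocardial
# infarction". The synonym table is invented for illustration.
SYNONYMS = {
    "heart attack": ["myocardial infarction", "acute mi"],
}

def concept_search(query, reports):
    terms = [query.lower()] + SYNONYMS.get(query.lower(), [])
    return [r for r in reports if any(t in r.lower() for t in terms)]

reports = [
    "CT chest: sequelae of prior myocardial infarction.",
    "Unremarkable study.",
]
print(concept_search("heart attack", reports))  # matches the first report
```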

“The goal for enhancing the entire diagnostic process is to provide clinically relevant information when and where it’s needed.”

“New advanced reporting techniques provide information that can lead to improved decision support and diagnostic outcomes.”

Please like, share and tweet this article.

Pass it on: New Scientist