beSpacific - Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002
POGO – “Congress is considering a simple but important step in overseeing federal agencies. A recently introduced bill would require a one-stop, easy-to-use, online location for all congressionally mandated reports. This may put an end to the world of lost and hidden government reports. Each year, Congress mandates that federal agencies report on programs, laws, and other aspects of government, big and small. Whether it’s an analysis of Medicare’s ability to provide health care to seniors, the price impact of agricultural subsidies, problems with the Navy’s aircraft carrier program, or Amtrak’s ability to keep the trains running on time, Congress wants to know. In fact, agencies complete several thousand congressionally mandated reports annually in order to keep both elected officials and the public informed. Of course, government reports are intended to shine a light on government operations and national issues, but in an odd and persistent twist, Congress, the press, and the public can’t always find the reports after they are published. Surprisingly, no government agency or congressional office currently has the job to keep track of the reports. Instead, each agency has its own system of issuing and transmitting reports. Major reports of national and political focus are closely tracked and covered in the press. However, those that are less notable, but still important, may slip between the bureaucratic cracks…”
“For many years, the scientific community has been wondering—and often worrying—about the extent to which the public trusts science. Some observers have warned of a “war on science,” and recently some have expressed concern about the rise of populist antagonism to the influence of experts. But public confidence in the scientific community appears to be relatively strong, according to a nationally representative survey of adults in the United States by the Pew Research Center in 2016. Furthermore, scientists are the only group among the 13 institutions covered in the General Social Survey conducted by the National Opinion Research Center where public confidence has remained stable since the 1970s. However, this favorable attitude is somewhat tepid. Only four in 10 people reported a great deal of confidence in the scientific community. A series of other Pew Research Center studies, however, have revealed that public trust in scientists in matters connected with childhood vaccines, climate change, and genetically modified (GM) foods is more varied. Overall, many people hold skeptical views of climate scientists and GM food scientists; a larger share express trust in medical scientists, but there, too, many express what survey analysts call a “soft” positive rather than a strongly positive view. There are, of course, important differences in opinions about scientists in each of these domains. For example, people’s views about climate scientists vary strongly depending on their political orientation, consistent with more than a decade of partisan division over this issue. But public views about GM food scientists and medical scientists are not strongly divided along political lines. Instead, views about GM food issues connect with people’s concerns about the relationship between food and health; most people are skeptical of scientists working on GM food issues and are deeply skeptical of information from food industry leaders on this issue. 
On the other hand, older adults, people who care more about childhood vaccine issues, and those who know more about science are, generally, more trusting of medical scientists working on childhood vaccine issues than are other people…”
“A NOAA-sponsored report shows that the warming trend transforming the Arctic persisted in 2017, resulting in the second warmest air temperatures, above average ocean temperatures, loss of sea ice, and a range of human, ocean and ecosystem effects. Now in its 12th year, the Arctic Report Card, released today at the annual American Geophysical Union fall meeting in New Orleans, is a peer-reviewed report that brings together the work of 85 scientists from 12 nations. While 2017 saw fewer records shattered than in 2016, the Arctic shows no sign of returning to the reliably frozen region it was decades ago. Arctic temperatures continue to increase at double the rate of the global temperature increase. One chapter in the Arctic Report Card shows, using historical data, that the current observed rate of sea ice decline and warming temperatures are higher than at any other time in the last 1,500 years, and likely longer than that. The Arctic Report Card provides an annual update on how the region is faring environmentally, and compares these observations to the long-term record. This information can be used to inform decisions on adaptation by local, tribal, state and federal leaders as they confront both the obstacles and the possibilities posed by a changing climate to economic growth, national security, public safety and natural resource conservation…”
“The American Bar Association Standing Committee on Ethics and Professional Responsibility has issued Formal Opinion 478 that provides the nation’s judicial branch guidance related to the ethical boundaries of independent factual research on the internet. The guidance is consistent with the ABA Model Code of Judicial Conduct, but notes that judicial notice is governed by the law of evidence in each jurisdiction. The opinion draws a bright-line distinction between independent investigation of “adjudicative facts” and research of “legislative facts” of law and policy. Formal Opinion 478 also provides guidance on internet research by judges of the lawyers and the parties involved in the case. “Stated simply, a judge should not gather adjudicative facts from any source on the Internet unless the information is subject to proper judicial notice,” Formal Opinion 478 said. “Further … judges should not use the Internet for independent fact-gathering related to a pending or impending matter where the parties can easily be asked to research or provide the information. The same is true of the activities or characteristics of the litigants or other participants in the matter.” The opinion provides five hypothetical situations, and provides an analysis of each and how they might be handled by a judge. The ABA Standing Committee on Ethics and Professional Responsibility periodically issues ethics opinions to advise lawyers, courts and the public in interpreting and applying ABA model ethics rules to specific issues of legal practice, client-lawyer relationships and judicial behavior. Formal Opinion 478 and previous ABA ethics opinions are available on the ABA Center for Professional Responsibility website under “Latest Ethics Opinions.” Go to www.abalegalfactcheck.com for the ABA’s new feature that cites case and statutory law and other legal precedents to distinguish legal fact from fiction.”
The Outline, Jon Christian: “Meet the man trying to catch Google search at its worst. Robert Epstein may be paranoid, but he is right when he says search engines should be kept in check. Earlier this week, we wrote about how Google can highlight erroneous or unconfirmed reports in the immediate aftermath of breaking news. But these rapidly-shifting results are quickly lost in time as the search engine’s algorithms self-correct, making it difficult for outsiders — including journalists — to hold the search engine accountable for spreading potentially harmful information. There is one group working on a concept for a system that would establish a record of search engine results. The idea is similar to the Internet Archive, which downloads periodic copies of websites, but more complicated since search engines display different results depending on the time as well as the location and history of the user. The solution for tracking such a complicated system is described in a prospectus for the Sunlight Society, founded by a group of 20 researchers under the banner of the American Institute for Behavioral Research and Technology (AIBRT), a nonprofit in Vista, California that conducts research in psychology and tech. The concept is similar to Nielsen Media Research’s longstanding system that collects information about audience size and demographics of television viewers through meters installed in households around the country. But instead of monitoring TV habits of real people, the system would monitor their internet use. This would require a worldwide network of paid collaborators who would provide the Sunlight Society with access to their search results…”
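The archiving concept described above can be sketched in a few lines. This is a purely hypothetical structure, not the Sunlight Society's actual design: each observed result page is stored with the query, the user, the location, and a timestamp, since search results vary along all three dimensions.

```python
# Illustrative sketch (hypothetical, not the Sunlight Society's design):
# record timestamped snapshots of the results a given user saw for a
# query, so results that later shift can still be audited.
import time

archive = []  # append-only log of observed result pages


def record_snapshot(query, results, user_id, location):
    archive.append({
        "query": query,
        "results": list(results),   # ranked URLs as seen by this user
        "user": user_id,            # results vary by user history...
        "location": location,       # ...and by location
        "observed_at": time.time(), # ...and by time, so all are keyed
    })


def snapshots_for(query):
    """All recorded result sets for a query, oldest first."""
    return sorted((s for s in archive if s["query"] == query),
                  key=lambda s: s["observed_at"])
```

The append-only log is the point: once a snapshot is written, later algorithmic self-correction by the search engine cannot erase the record of what was shown.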
The New York Times: “Globally, we throw out about 1.3 billion tons of food a year, or a third of all the food that we grow. That’s important for at least two reasons. The less the world wastes, the easier it will be to meet the food needs of the global population in coming years. Second, cutting back on waste could go a long way to reducing greenhouse gas emissions. How do we manage to waste so much? Food waste is a glaring measure of inequality. In poor countries, most of the food waste is on the farm or on its way to market. In South Asia, for instance, half of all the cauliflower that’s grown is lost because there’s not enough refrigeration, according to Rosa Rolle, an expert on food waste and loss at the United Nations Food and Agriculture Organization. Tomatoes get squished if they are packed into big sacks. In Southeast Asia, lettuce spoils on the way from farms to city supermarkets. Very little food in poor countries is thrown out by consumers. It’s too precious…”
Grateful that there are really good public TV shows still on the air, some older than I am; here they are via 24/7 Wall St: “To determine the longest running primetime TV shows of all time, 24/7 Wall St. developed a list of primetime television shows using the Internet Movie Database and other sources. Click here to see the full list of the longest running TV shows of all time. Six PBS programs are among the 10 longest running shows, and together they have logged a total of 231 years on the air. The science program “Nova” has been on television for 44 years and trails only “60 Minutes” (49 years) in terms of longevity. Other long-running shows on PBS are “The Victory Garden” (43 years); “This Old House” (39 years); “Nature” (35 years); “Frontline” (35 years); and “Wall Street Week” (35 years). Of those programs, only “Wall Street Week” is no longer on the air.”
Stunning, haunting, expansive, awe inspiring, magnificent, via The Atlantic – “National Geographic has announced the winners of its annual photo competition, with the Grand Prize Winner Jayaprakash Joghee Bojan receiving a prize of $7,500 for his image of an orangutan in Borneo. National Geographic was once again kind enough to let us display the winning images and honorable mentions here from the four categories: Wildlife, Landscapes, Aerials, and Underwater.”
Brian Merchant: “There are some 269 billion emails sent and received daily. That’s roughly 35 emails for every person on the planet, every day. Over 40 percent of those emails are tracked, according to a study published last June by OMC, an “email intelligence” company that also builds anti-tracking tools. The tech is pretty simple. Tracking clients embed a line of code in the body of an email—usually in a 1×1 pixel image, so tiny it’s invisible, but also in elements like hyperlinks and custom fonts. When a recipient opens the email, the tracking client recognizes that pixel has been downloaded, as well as where and on what device. Newsletter services, marketers, and advertisers have used the technique for years, to collect data about their open rates; major tech companies like Facebook and Twitter followed suit in their ongoing quest to profile and predict our behavior online…”
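The pixel mechanism described above can be shown in a minimal sketch. Everything here is illustrative (no vendor's real API): a unique image URL is embedded per recipient, and when the mail client fetches that image, the sender's server logs the open along with the request metadata.

```python
# Minimal sketch of email open tracking: a unique 1x1-pixel image URL is
# embedded per recipient; when the mail client fetches the image, the
# sender's server records the open. All names here are illustrative.
import uuid

def make_tracked_email(recipient: str, body_html: str, base_url: str):
    """Return (token, html) with an invisible tracking pixel appended."""
    token = uuid.uuid4().hex  # unique per recipient/send
    pixel = f'<img src="{base_url}/open/{token}.gif" width="1" height="1" alt="">'
    return token, body_html + pixel

# Server side: a request for /open/<token>.gif is recorded.
open_log = {}

def record_open(token: str, client_ip: str, user_agent: str) -> None:
    # Where and on what device the mail was opened, inferred from the request.
    open_log.setdefault(token, []).append({"ip": client_ip, "ua": user_agent})

token, html = make_tracked_email("alice@example.com", "<p>Hello</p>",
                                 "https://track.example.com")
record_open(token, "203.0.113.7", "Mozilla/5.0")
```

This is also why many mail clients now block remote images by default: if the image is never fetched, the open is never logged.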
“Recognising the many scientific, economic, and social benefits of more open science, research policy makers and funders around the world are increasingly likely to prefer or mandate open data, and to require data management policies that call for the long-term stewardship of research data. At the same time, there are ever more data being created and used within research, and access to data is playing an increasingly central role in many research fields. Indeed, there are a number of fields of research that depend almost entirely upon the availability of global data sources provided through research data repositories. As a result, repositories for the curation and sharing of research data have become a vital part of the research infrastructure. It is thus essential to ensure that these repositories are adequately and sustainably funded. However, relatively little work has been done to date on the revenue streams or business models that might provide ongoing support for research data repositories. This project was designed to take up the challenge and to contribute to a better understanding of how research data repositories are funded, and what developments are occurring in their funding. Central questions included:
- How are data repositories currently funded, and what are the key revenue sources?
- What innovative revenue sources are available to data repositories?
- How do revenue sources fit together into sustainable business models?
- What incentives for, and means of, optimising costs are available?
- What revenue sources and business models are most acceptable to key stakeholders?
Forty-eight structured interviews were undertaken with repository managers from 18 countries and a broad range of research domains. They provided insights into key issues, which were further elaborated in two international workshops involving a variety of stakeholders – including repository managers, funders, and policy analysts. There is a large variety of repositories that are responsible for providing long term access to data that is used for research. As data volumes and the demands for more open access to this data increase, these repositories are coming under increasing financial pressures that can undermine their long-term sustainability. This report explores the income streams, costs, value propositions, and business models for 48 research data repositories. It includes a set of recommendations designed to provide a framework for developing sustainable business models and to assist policy makers and funders in supporting repositories with a balance of policy regulation and incentives. The document can also be referenced at the permanent link: http://www.oecd-ilibrary.org/science-and-technology/business-models-for-sustainable-research-data-repositories_302b12bb-en”
Crowdsourcing Accurately and Robustly Predicts Supreme Court Decisions — By Daniel Martin Katz, Michael Bommarito, Josh Blackman – via SSRN
“ABSTRACT: Scholars have increasingly investigated “crowdsourcing” as an alternative to expert-based judgment or purely data-driven approaches to predicting the future. Under certain conditions, scholars have found that crowd-sourcing can outperform these other approaches. However, despite interest in the topic and a series of successful use cases, relatively few studies have applied empirical model thinking to evaluate the accuracy and robustness of crowdsourcing in real-world contexts. In this paper, we offer three novel contributions. First, we explore a dataset of over 600,000 predictions from over 7,000 participants in a multi-year tournament to predict the decisions of the Supreme Court of the United States. Second, we develop a comprehensive crowd construction framework that allows for the formal description and application of crowdsourcing to real-world data. Third, we apply this framework to our data to construct more than 275,000 crowd models. We find that in out-of-sample historical simulations, crowdsourcing robustly outperforms the commonly-accepted null model, yielding the highest-known performance for this context at 80.8% case level accuracy. To our knowledge, this dataset and analysis represent one of the largest explorations of recurring human prediction to date, and our results provide additional empirical support for the use of crowdsourcing as a prediction method.”
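The core comparison in the abstract, crowd aggregation versus a null model, can be illustrated with a toy sketch. This is not the authors' actual crowd-construction framework; it simply shows the shape of the evaluation: aggregate many individual forecasts per case (here by majority vote) and compare case-level accuracy against a baseline that always predicts a single fixed outcome.

```python
# Illustrative sketch (not the paper's actual models): majority-vote crowd
# aggregation vs. a fixed-outcome null model, scored on case-level accuracy.
from collections import Counter

def majority_vote(predictions):
    """predictions: list of 'affirm'/'reverse' guesses for one case."""
    return Counter(predictions).most_common(1)[0][0]

def accuracy(cases, predict):
    """cases: list of (predictions, actual); predict: aggregation function."""
    hits = sum(predict(preds) == actual for preds, actual in cases)
    return hits / len(cases)

# Tiny made-up dataset: (individual forecasts, actual outcome) per case.
cases = [
    (["reverse", "reverse", "affirm"], "reverse"),
    (["affirm", "affirm", "reverse"], "affirm"),
    (["reverse", "affirm", "affirm"], "affirm"),
]

crowd_acc = accuracy(cases, majority_vote)          # aggregate the crowd
null_acc = accuracy(cases, lambda _: "reverse")     # fixed-outcome baseline
```

On this made-up data the crowd scores 1.0 against the baseline's 1/3; the paper's contribution is showing that this kind of advantage holds robustly over 600,000 real predictions.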
“A Q&A with Tom Blanton, director of the National Security Archive, on the historical value of Hillary Clinton’s emails, the sins of Julian Assange, and what national secrets are really worth keeping. How much does it cost to keep a secret? Well, the U.S. government sort of has an answer: $16.89 billion. That’s how much it spent in 2016 to classify information that it deems too sensitive to be released to the public. Some secrets are worth keeping, of course — like how to cook up chemical weapons, for instance. But others, less so. Rodney McDaniel, a top National Security Council official during the administration of President Ronald Reagan, estimated that only 10 percent of classification was for the “legitimate protection of secrets.” Former New Jersey Governor Tom Kean, a head of the 9/11 commission, said that “three quarters of what I read that was classified should not have been.” In fact, he argued that overclassification had left the U.S. more vulnerable to the 9/11 attacks. And that’s to say nothing of its everyday effects on government accountability and efficiency, congressional oversight and public awareness…”
Washington Post Viz: What Tech World Did You Grow Up In? “In the past three decades, the United States has seen staggering technological changes. In 1984, just 8 percent of households had a personal computer, the World Wide Web was still five years away, and cell phones were enormous. Americans born that year are only 33 years old. Here’s how some key parts of our technological lives have shifted, split loosely into early, middle and current stages…”
MIT Technology Review – The Download: “If you have ever dealt with sexual harassment in the workplace, there is now a private online place for you to go for help. Botler AI, a startup based in Montreal, on Wednesday launched a system that provides free information and guidance to those who have been sexually harassed and are unsure of their legal rights. Using deep learning, the AI system was trained on more than 300,000 U.S. and Canadian criminal court documents, including over 57,000 documents and complaints related to sexual harassment. Using this information, the software predicts whether the situation explained by the user qualifies as sexual harassment, and notes which laws may have been violated under the criminal code. It then generates an incident report that the user can hand over to relevant authorities. Ritika Dutt, a cofounder of Botler AI, was stalked at a previous job, and the experience left her feeling trapped. “I didn’t know if it was something that was really going wrong or if I was just being overly sensitive about it,” Dutt says. “I didn’t know what my rights were, if he was breaking any laws, or how to deal with it.” Her personal experience—along with the recent string of allegations of sexual misconduct, particularly among powerful men in media, entertainment, and politics—motivated Botler AI to attempt to create an unbiased tool that can be a resource for the average person. “At the end of the day, hearing about all these cases angered me. On a personal level I got so annoyed and upset,” Dutt says. “How many people think they can do this and get away with this? Everyone should be able to have a resource to go to get information and get educated without fear of judgment…”
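The underlying task, classifying a user's free-text description against labeled legal documents, can be illustrated with a deliberately tiny sketch. Botler AI's real system uses deep learning over 300,000+ court documents; the stand-in below is a word-count Naive Bayes classifier on four made-up training sentences, shown only to make the general approach concrete.

```python
# Toy stand-in for the general approach (the real system uses deep
# learning): classify a description as harassment-related or not with a
# tiny Naive Bayes over word counts, using add-one smoothing.
import math
from collections import Counter

def train(labeled_docs):
    """labeled_docs: list of (text, label). Returns per-label word counts."""
    counts = {}
    for text, label in labeled_docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label maximizing log P(words | label) with add-one smoothing."""
    vocab = {w for c in counts.values() for w in c}
    best, best_score = None, -math.inf
    for label, c in counts.items():
        total = sum(c.values()) + len(vocab)
        score = sum(math.log((c[w] + 1) / total) for w in text.lower().split())
        if score > best_score:
            best, best_score = label, score
    return best

# Made-up training examples, standing in for labeled court documents.
model = train([
    ("repeated unwanted advances at work", "harassment"),
    ("supervisor made threatening comments", "harassment"),
    ("disagreement about project deadline", "other"),
    ("meeting ran long again", "other"),
])
```

A production system would add the second step the article describes, mapping a positive classification to the specific criminal-code provisions it implicates, which is retrieval over statute text rather than classification.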
National Academies of Sciences, Engineering, and Medicine. 2017. Knowledge Management Resource to Support Strategic Workforce Development for Transit Agencies. Washington, DC: The National Academies Press. https://doi.org/10.17226/24961.
“TRB’s Transit Cooperative Research Program (TCRP) has released a pre-publication, non-edited version of TCRP Research Report 194: Knowledge Management Resource to Support Strategic Workforce Development for Transit Agencies. The guidebook explores the importance of knowledge management (KM), which is an organization’s process for collecting, storing, and sharing organizational information and knowledge, and provides guidance on implementing KM strategies in transit agencies. In addition, the guidance includes action plans for developing particular aspects of KM, analysis of KM strategies at several transit agencies, and a catalog of KM technology tools and resources.”
Wes Siler – Outside: “Human society is made possible by rules, both written and unwritten. Yet there’s no such series of concrete, accepted rules for dog owners, and that’s becoming a problem. Take my experience this past weekend. In need of a quick getaway, my girlfriend and I booked a room at the Kimpton Goodland, in Santa Barbara, California, with our two mutts. All Kimpton hotels are incredibly dog friendly, which makes the boutique chain a unique resource for dog owners. There’s no extra deposit or fees for dog owners, and the pups are allowed anywhere in the hotel. (Except for the restaurants.) It’s a unique opportunity to enjoy a nice hotel with your dogs. But this weekend, even we were annoyed with the behavior of other dog owners. Dogs locked in rooms unattended barked persistently. Owners let their small untrained and unsocialized pets bark at other guests in the lobby and hotel bar. Some took their dogs to the poolside lawn for bathroom breaks. Of course, Kimpton, and other dog-friendly businesses, has some basic guidelines for dog owners: pay for any damage the dogs cause, pick up after them, keep them under control. But rules like that are both vague and extremely basic. There’s no further instruction on how to behave in public with your dog from dog owner organizations like the American Kennel Club. While the AKC offers a Good Canine Citizen certification to the dogs themselves, it offers no guidance for owners. If we want to be able to take our dogs into more hotels and businesses, and if we want to be welcome in public places and in general get along with the rest of human society, then us dog owners need a rulebook—an agreed-upon set of behaviors that will allow us, as a community, to better share our limited resources and to interact with the non-dog-owning public in a way that’s positive for everyone. This is my best effort at setting those rules down in writing…”
Executive Compensation at Private and Public Colleges By Dan Bauman, Tyler Davis, Ben Myers, and Brian O’Leary December 10, 2017 – “The Chronicle‘s executive-compensation package includes the latest data on more than 1,200 chief executives at more than 600 private colleges from 2008-15 and nearly 250 public universities and systems from 2010-16. Hover over bars to show total compensation as well as pay components including base, bonus, and other. Click bars to see details including other top-paid college employees, how presidents compare with their peers, and how presidential pay looks in context to an institution’s expenses, tuition, and pay for professors. Updated December 10, 2017, with 2015 private-college data.”
EFF: “One of the most pernicious forms of censorship in modern America is the abuse of the court system by corporations and wealthy individuals to harass, intimidate, and silence their critics. We use the term “Strategic Lawsuit Against Public Participation,” more commonly known as a “SLAPP,” to describe this phenomenon. With a SLAPP, a malicious party will file a lawsuit against a person whose speech is clearly protected by the First Amendment. The strategy isn’t to win on the legal merits, but to censor their victims through burdensome, distracting, and costly litigation. SLAPP suits often make spurious defamation claims and demand outrageous monetary penalties to bully their enemies. In EFF’s work, we’ve seen SLAPPs deployed against journalists and bloggers, cartoonists, and even people who have posted reviews on websites like Yelp and eBay. They’ve been used by election power players against their political opponents and by corporations against non-profits whose job is to hold them in check. In fact, EFF faced such a scheme when an Australian company filed a lawsuit to censor one of our “Stupid Patent of the Month” articles. Although EFF won in court, the lawsuit required resources that we otherwise could have devoted to other battles…”
“The nation experienced an increase in commuting time, median gross rent and a rise in English proficiency among those who spoke another language. These are only a few of the statistics released today from the U.S. Census Bureau’s 2012-2016 American Community Survey five-year estimates data release, which features more than 40 social, economic, housing and demographic topics, including homeowner rates and costs, health insurance and educational attainment. “The American Community Survey allows us to track incremental changes across our nation on how people live and work, year-to-year,” said David Waddington, chief of the Social, Economic, and Housing Statistics Division. “It’s our country’s only source of small area estimates for socio-economic and demographic characteristics. These estimates help people, businesses and governments throughout the country better understand the needs of their populations, the markets in which they operate and the challenges and opportunities they face.” The survey produces statistics for all of the nation’s 3,142 counties. In addition, it is the only full dataset available for three-fourths of all counties with populations too small to produce a complete set of single-year statistics (2,322 counties). Each year, Census Bureau data helps determine how more than $675 billion of federal funding are spent on infrastructure and services, from highways to schools to hospitals…”
Retirement Benefits for Members of Congress, Katelin P. Isaacs, Specialist in Income Security. December 5, 2017: “…Under both the Civil Service Retirement System (CSRS) and the Federal Employees’ Retirement System (FERS), Members of Congress are eligible for a pension at the age of 62 if they have completed at least five years of service. Members are eligible for a pension at age 50 if they have completed 20 years of service, or at any age after completing 25 years of service. The amount of the pension depends on years of service and the average of the highest three years of salary. By law, the starting amount of a Member’s retirement annuity may not exceed 80% of his or her final salary…”
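The arithmetic the report describes can be sketched as follows. The 80% cap on the starting annuity relative to final salary comes from the report; the per-year accrual rate in this sketch is purely illustrative, not the actual CSRS or FERS statutory formula, which the report itself details.

```python
# Hedged sketch of the pension arithmetic described above. The 80%-of-
# final-salary cap is from the CRS report; the accrual rate below is an
# illustrative placeholder, NOT the actual CSRS/FERS formula.
def starting_annuity(high3_avg, years_of_service, final_salary,
                     accrual_rate=0.017):
    """Annuity = high-3 average salary x accrual rate x years, capped."""
    uncapped = high3_avg * accrual_rate * years_of_service
    return min(uncapped, 0.80 * final_salary)  # statutory 80% cap
```

With an illustrative 1.7% accrual, a 20-year Member with a $170,000 high-3 average would start at $57,800, well under the cap; the cap only binds at very long service.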