
FiveThirtyEight Model Gives Biden 40% Chance of Winning Nomination

With the first votes of the 2020 Democratic nomination process yet to be cast, it’s impossible to predict with certainty who will emerge as the Democratic nominee to face Trump in the general election, particularly given the historically large field of candidates running for president this year. That said, pollsters have worked tirelessly since the beginning of the primary season to measure voters’ preferences for each of the candidates, generating a tremendous amount of data for analysts at organizations like FiveThirtyEight to sift through. Accordingly, FiveThirtyEight has just published the first iteration of its forecast simulating the outcome of the primary season: it gives Biden a 2-in-5 chance of winning the nomination, Sanders a 1-in-5 chance, Warren a 1-in-8 chance, Buttigieg a 1-in-10 chance, and the remaining candidates just a 1-in-40 chance.


The forecast’s results, based on thousands of computer simulations of the primary season driven by polling data and by models built from analyses of previous presidential nominations, were published in an interactive format that lets users view each candidate’s calculated probability of victory in each state. Though FiveThirtyEight has analyzed political polls for more than ten years, this year marks the first time the ABC News-owned organization has published a “complete back-to-front model of the presidential primaries.” Despite the many complexities involved, such as the difficult-to-predict impact of one state’s primary or caucus on subsequent ones, the organization feels confident enough in the accuracy of its simulations to publish its findings even at this early stage in the process. One factor behind that confidence is the amount of data collected on the primary processes of 2008 and 2016, which helps analysts understand how presidential primaries tend to play out. The outcome of the Iowa caucuses, for instance, has historically had a tremendous impact on voters in the other 49 states.
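The basic Monte Carlo idea described above, running the season many thousands of times and counting how often each candidate comes out on top, can be sketched in a few lines. The support shares and noise level below are purely illustrative, not FiveThirtyEight’s data or model:

```python
import random

# Illustrative national support shares -- not FiveThirtyEight's numbers;
# the "noise" term stands in for polling error and campaign events.
support = {"Biden": 0.28, "Sanders": 0.20, "Warren": 0.17,
           "Buttigieg": 0.12, "Other": 0.23}

def simulate_primary(rng):
    """One simulated primary season: jitter each candidate's support and
    declare the strongest candidate the nominee."""
    noisy = {name: share + rng.gauss(0, 0.08) for name, share in support.items()}
    return max(noisy, key=noisy.get)

def win_probabilities(n_sims=10_000, seed=538):
    """Run the season many times and count how often each candidate wins."""
    rng = random.Random(seed)
    wins = dict.fromkeys(support, 0)
    for _ in range(n_sims):
        wins[simulate_primary(rng)] += 1
    return {name: count / n_sims for name, count in wins.items()}

probs = win_probabilities()
```

Even in this toy version, the leading candidate’s win probability falls well short of 100%, which is the point the forecast makes about Biden.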


That being said, FiveThirtyEight founder Nate Silver stresses that the model is a “forecast, … not an estimation of what would happen in an election held today” and that the forecast is “probabilistic,” with a high degree of uncertainty. As more political events shape voters’ opinions, more polls are conducted, and the first states begin to hold primaries and caucuses, the organization will continue to refine and update its forecast. Silver also stresses that FiveThirtyEight’s predictions should be taken literally: although Biden is currently calculated to have the best chance of any candidate of winning the nomination, that probability is only 40%, meaning it is actually more likely than not that one of the other candidates will win instead.


Although Trump’s surprise victory in 2016 led many observers to conclude that poll data is not to be trusted, as organizations like the New York Times had predicted with 85% certainty on the eve of the election that Clinton would win, FiveThirtyEight has a better track record than most when it comes to the accuracy of its predictions. In 2016, FiveThirtyEight was far more pessimistic than most news outlets about the likelihood of a Clinton victory, giving the former First Lady a two-in-three chance of winning. As Nate Silver once commented, “one-in-three chances happen all the time;” viewed from this perspective, Trump’s 2016 win is no surprise to anyone with a realistic understanding of how to interpret probabilistic models. Accordingly, while Joe Biden has consistently led opinion polls since announcing his candidacy last year and has by far the highest probability of winning the Democratic nomination, the race is still very much up in the air, as three other candidates stand a decent chance of victory as well.
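Silver’s point that “one-in-three chances happen all the time” is easy to check with a quick simulation; the underdog probability below is illustrative:

```python
import random

def upset_frequency(p_upset=1/3, n_elections=100_000, seed=0):
    """Simulate many independent elections in which the underdog has
    probability p_upset of winning, and return the observed upset rate."""
    rng = random.Random(seed)
    upsets = sum(rng.random() < p_upset for _ in range(n_elections))
    return upsets / n_elections

rate = upset_frequency()  # close to 1/3: upsets are routine, not shocking
```

Roughly one election in three goes to the underdog, exactly as the stated probability says it should.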


Can “Virtual Travel” Replace Traditional Vacations?

While the desire to visit foreign countries and exotic locales is a near-universal human experience, it is also one that can only be realized by people with a certain amount of privilege. For one, traveling is expensive, and it requires a job that allows employees to be absent from work for several days at a time. Health issues and disabilities, meanwhile, keep many would-be travelers stuck at home. Recently, environmental concerns have given tourists a bad name: flying is considered one of the most environmentally damaging ways to travel, and tourists often litter, much to the chagrin of local residents. As the global population expands, an increasing number of people are visiting vacation destinations, leading to overcrowding and worsening the burden of tourism on locals. As technology advances, though, the future of tourism may lie in virtual reality, as simulations of tourist experiences grow ever more realistic and immersive.


Long considered to reside squarely in the realm of futuristic sci-fi stories, virtual reality simulations of tourist destinations are already widespread in the form of 3D videos and games designed for use with headsets like the Oculus Rift. While impressive, these simulations don’t come close to replicating the experience of visiting a faraway destination in person, as they are limited to sights and sounds and generally offer users little to no freedom to shape the nature of their experience. All of this is set to change, however, as technology improves and developers invest more into expanding and refining these experiences.

Already, the travel industry is being disrupted by an influx of technology: augmented-reality apps help travelers determine whether their luggage will fit in the overhead compartment, and other apps let users preview restaurant meals as 3D models superimposed on real-world objects. Some companies, like the airline KLM, are looking to entice tourists by offering vacationers a preview of their destination in the form of 3D, 360-degree videos viewed with a virtual reality headset. While immersive, these experiences are not interactive, so their ability to replicate the travel experience is limited.


Other companies, however, are looking to replace the travel experience altogether by incorporating more sophisticated elements, like computer graphics and interactivity, into their virtual-reality offerings. Similar programs already exist in the form of video games, the industry currently most heavily invested in virtual reality. With time, though, virtual reality headsets are likely to grow in popularity as they become more useful for medical, business, and educational purposes. One company pushing the boundaries of virtual travel is TimeRide, whose virtual reality experience in Berlin allows customers to “experience the past directly” through a headset that shows images of the city’s past. Other companies are looking to entice customers with live, 360-degree videos recorded by drones exploring locations around the world.

Despite these developments, though, virtual reality has a number of hurdles to overcome before it can truly replicate the travel experience. For one, virtual reality headsets only provide video and audio, whereas real-world traveling engages all of a person’s senses. And while artificial intelligence and telecommunications technology have improved, they still cannot replicate the experience of meeting another human being face-to-face. Nevertheless, virtual reality technology promises to shape the future of the travel industry, and it has the potential to bring the joy of travel to millions of people who otherwise lack the means to experience it.


Climate Models Were Always Right on Global Warming

Many people across the United States – and the world – do not accept the evidence for climate change, with some arguing that climate models over-predict how fast the Earth is heating up.

With startling regularity, skeptics claim to have proof that climate change is not happening as fast as the experts say. These claims are usually based on individual, often misinterpreted pieces of data, even though multiple studies over the years have re-examined different climate models and concluded that they continue to perform well.

A recent study extensively investigated the global climate models released between the 1970s and 2007, including those used in the Intergovernmental Panel on Climate Change’s first three reports.

Lead study author Zeke Hausfather, a climate scientist at the University of California, Berkeley, worked alongside Tristan Abbott and Henri Drake, scientists at the Massachusetts Institute of Technology, and Gavin Schmidt, a scientist at NASA. Hausfather comments:

“It’s always a sign that you’re onto a good project when your first thought is, ‘Why hasn’t anyone done this before?’ No one has really gone back and gathered all of the old model predictions that were in the literature, in part because climate models have changed a lot.”

Naturally, many of these original models have become archaic and been replaced by newer ones. Yet even without the benefit of today’s technology, most of the early models got their predictions of how much the Earth would warm right. In fact, of the 17 models examined, only three were found to be inaccurate.


However, the latest study focuses on an often-ignored yet crucially important point about how climate models work: each model takes a projection of future greenhouse gas emissions as its input, and that projection in turn determines the level of warming the model predicts.

But prediction is difficult, especially for carbon emissions, which depend on many factors such as population growth, changes in the energy landscape, and economic shifts, all human factors that make a difference to the natural world.

The study also found that several models criticized in the past were actually quite accurate: they simulated the connection between greenhouse gases and temperature correctly, but their assumptions about future carbon emissions differed from the emissions that were eventually produced.

Basically, if scientists had entered the correct levels of greenhouse gas emissions when first running the models, the warming the models predicted would have closely matched what we have actually seen.
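The logic can be illustrated with a toy energy-balance calculation in which warming is simply proportional to radiative forcing. Every number below (the sensitivity, the forcing values, the observed warming) is made up for illustration, not taken from the study or from any real climate model:

```python
# Toy linear "model": warming is proportional to radiative forcing.
# All values here are illustrative, not from the study or a real model.
SENSITIVITY = 0.5  # degrees C of warming per W/m^2 of forcing (assumed)

def projected_warming(forcing_w_m2):
    """Warming implied by the model 'physics' for a given forcing input."""
    return SENSITIVITY * forcing_w_m2

assumed_forcing = 2.4   # emissions scenario the modelers assumed (W/m^2)
actual_forcing = 1.6    # forcing that actually materialized (W/m^2)
observed_warming = 0.8  # warming actually observed (degrees C)

# Fed the assumed scenario, the model appears to over-predict...
over_prediction = projected_warming(assumed_forcing)  # 1.2 C
# ...but fed the forcing that actually occurred, the same physics
# reproduces the observed warming.
corrected = projected_warming(actual_forcing)         # 0.8 C
```

The apparent "error" lies entirely in the emissions scenario fed in, not in the relationship between forcing and temperature, which is the study’s central distinction.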

During the 1980s, NASA researcher James Hansen created a climate model that eventually led him to give congressional testimony on the threat of climate change. His testimony helped raise awareness of climate change throughout the world; however, because his model over-predicted warming by around 50%, skeptics were able to cite it as proof that global warming was not a real threat, arguing that scientists were prone to exaggerating the facts.

Hausfather and his colleagues were quick to point out that the problem with Hansen’s model was not its physics but its assumption that emissions of methane and chlorofluorocarbons, both potent greenhouse gases, would be higher than they actually turned out to be.


One reason for this over-projection was that the model did not account for the impact of the Montreal Protocol, a global agreement to phase out chlorofluorocarbons in order to protect the ozone layer and potentially repair the damage that had already been caused.

As Hausfather said, “If you went back and reran that model with the actual levels of CO2 in the atmosphere and methane and chlorofluorocarbons, you would have gotten a value that was indistinguishable from the warming that we’ve actually observed.”

Even with these issues addressed, many obstacles remain for the next generation of climate models, such as ensuring that assumptions about future greenhouse gas emissions are accurate.

There will also be a need to examine specific physical processes on Earth that are difficult to understand, including clouds, which have always been hard to represent even though many scientists think they will be an important influence on the climate. And as each model becomes more and more detailed, the emphasis on getting those details right will only grow.

In summary, the latest study implies that conclusions from previous models have been accurate for many years when it comes to global warming.

As Hausfather said, “they haven’t been overestimating warming, but at the same time it isn’t warming faster than we thought. It’s pretty much warming just as we thought it would.”


Microsoft Leverages Cutting-Edge Technology in Flight Simulator

Believe it or not, Microsoft’s longest-running franchise is its Flight Simulator series, which debuted on IBM PCs in 1982. The most recent release in the series, 2006’s Flight Simulator X, was widely believed to be the end of the franchise, as Microsoft had made no announcements on the subject for over a decade. That changed this year, however, when the company announced a reboot of the franchise, titled simply Flight Simulator, due for release in 2020 on Windows and Xbox platforms. Though the program is still in the alpha stage of development, Microsoft showed off its progress at a press event in September, impressing journalists with the depth and complexity of the simulation.

Perhaps the most immediately striking aspect of the simulation’s design is the environment. In crafting a believable and lifelike world for users to explore, Microsoft leveraged its extensive collection of satellite and aerial imagery, creating 3D facsimiles of real-world locations in software. Whereas the environments in previous Flight Simulator titles were created by artists who placed individual objects in the virtual world, next year’s title uses machine learning to extract details about the real-world environment and represent them in the simulation. For instance, the program identifies forests and other wooded areas and populates them with procedurally generated trees and other vegetation. The result is an environment so detailed and true-to-life that members of the press were able to identify their own apartment buildings while flying overhead in a small plane.
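Procedural generation of this kind typically derives content deterministically from a seed rather than storing every object. The sketch below illustrates that general technique, not Microsoft’s actual pipeline; all names and parameters are invented:

```python
import random

def scatter_trees(tile_x, tile_y, density=5, world_seed=2020):
    """Deterministically scatter trees inside a 1x1 map tile classified as
    forest. Seeding the generator from the tile coordinates means every
    user sees the same trees without any positions being stored or
    transmitted; only the 'this area is forest' classification is needed.
    (An illustrative sketch, not Microsoft's actual pipeline.)"""
    rng = random.Random(hash((world_seed, tile_x, tile_y)))
    return [(tile_x + rng.random(), tile_y + rng.random())
            for _ in range(density)]

# The same tile always yields identical trees:
a = scatter_trees(10, 20)
b = scatter_trees(10, 20)
```

Determinism is the key property here: it lets an effectively infinite amount of scenery be "stored" as nothing more than a seed plus a classification map.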


In addition to the terrain, the simulation also attempts to replicate real-world weather conditions, constructed from data collected by weather stations around the world. The system analyzes radar imagery and other measurements in order to reconstruct cloud patterns, rain, snow, and other weather events, allowing users to fly through storms and hurricanes in real-time. As volumetric clouds are rendered dynamically, either in accordance with conditions in real life or at the user’s behest, the scattering of light through the sky is also simulated, leading to realistic shadows, sunrises, sunsets, and even rainbows where they would appear in real life.

By procedurally generating the environment, the developers at Microsoft are able to graphically represent the entire world, a feat that would be next to impossible if the environment were crafted by hand. There are some 44,000 airports in the real world, and Microsoft intends to replicate all of them in the simulation. That also means the game’s data, relative to other games, is absolutely massive; as such, content is dynamically streamed into the simulator from the Internet. An offline mode is also available, but it reduces the detail of the environments considerably, making them look more like the simple, flat satellite imagery used in 2006’s Flight Simulator X.
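The online/offline trade-off described above resembles a simple tile cache: stream high-detail tiles while connected, and fall back to bundled low-detail imagery otherwise. This is an illustrative sketch of that idea with invented names, not Microsoft’s actual streaming code:

```python
def load_tile(tile_id, online, cache):
    """Return a (tile_id, detail_level) pair, caching streamed tiles so a
    dropped connection doesn't degrade scenery already seen."""
    if tile_id in cache:
        return cache[tile_id]
    if online:
        tile = (tile_id, "high")  # streamed, ML-enhanced 3D tile
    else:
        tile = (tile_id, "low")   # flat satellite imagery shipped offline
    cache[tile_id] = tile
    return tile

cache = {}
first = load_tile("berlin_01", online=True, cache=cache)
second = load_tile("berlin_01", online=False, cache=cache)  # cached: stays high-detail
third = load_tile("paris_07", online=False, cache=cache)    # uncached: degrades
```

A real implementation would also evict old tiles and prefetch along the flight path, but the fallback behavior is the same: only content never streamed drops to the low-detail offline form.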


While the program’s simulation of the entire Earth is impressive and perhaps unprecedented, the more important aspect of the simulation may be its collection of aircraft. Microsoft is focused on creating as realistic an experience as it can, with additional attention to accessibility: the team is working to ensure that a wide range of peripherals, from flight sticks to the standard Xbox gamepad, are supported. Aircraft react realistically to changes in weather and other variables, and photogrammetry was used to replicate the interiors of planes’ cockpits. To demonstrate the accuracy of the simulation, Microsoft even invited a journalist to pilot a real plane after learning how to do so in the simulator.

Microsoft has very ambitious plans for this iteration of Flight Simulator. The company intends to support the product with continued development for an impressive ten years after its release, though it admits it doesn’t know yet exactly what this support will entail. Instead, Microsoft is going to release the closed alpha to a select number of users to observe how they interact with the virtual environment, and focus continued development efforts on the areas that generate the most engagement. There is certainly a lot of potential for simulation opportunities with a foundation as strong as the one Microsoft is building, and the company is going to be looking to the community for direction moving forward.