Friday, November 23, 2018

Point Nemo - Spacecraft's watery graveyard!

Point Nemo (48°52.6'S 123°23.6'W) lies over 2,688 km (1,670 mi) from the nearest land, equidistant from the coasts of three far-flung islands.

In the pack ice of the Arctic Ocean, at the greatest distance from any land, lies the Northern pole of inaccessibility. It is 661 km from the geographic North Pole, 1,453 km from Cape Barrow, Alaska, and 1,094 km from the nearest islands, Ellesmere and Franz Josef Land.

The location of the South pole of inaccessibility is not precisely determined: it should be the point in Antarctica farthest from the coastline of the Southern Ocean. However, scientists have not reached a consensus on how to define the word “coast” as applied to this region.

The Continental pole of inaccessibility is the place on land most remote from the oceans. It lies in Eurasia, in northern China (46°17' N, 86°40' E). The nearest shoreline is 2,645 km away.

Finally, the Oceanic pole of inaccessibility is located in the South Pacific at coordinates 48°52' S, 123°23' W. The researcher Hrvoje Lukatela calculated this point in 1992 with the help of computer modeling. It is also called Point Nemo, in honor of the captain from Jules Verne's novels. This is the place in the ocean most remote from land. The nearest land, the uninhabited atoll of Ducie Island, lies roughly 2,688 km away.
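As a rough sanity check on these figures, the great-circle distance from Point Nemo to Ducie Island can be computed with the standard haversine formula. This is a sketch, not Lukatela's actual method; the coordinates below are approximate decimal conversions of the published positions.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Point Nemo (48°52.6'S 123°23.6'W) and Ducie Island (approx. 24°41'S 124°47'W)
point_nemo = (-48.877, -123.393)
ducie = (-24.683, -124.783)

distance = haversine_km(*point_nemo, *ducie)
print(f"Point Nemo to Ducie Island: {distance:.0f} km")  # close to the quoted 2,688 km
```

The result lands within a few kilometers of the 2,688 km figure; the small discrepancy comes from using a spherical Earth model and rounded coordinates.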

The area is so desolate that almost no fauna survives there: the strong currents carry few nutrients, leaving little besides bacteria. Space agencies therefore use this section of the ocean as a dumping ground, on the assumption that the damage to people and nature at such a distance will be minimal. The remains of hundreds of decommissioned spacecraft and spacecraft parts lie buried at Point Nemo.

Where spaceships go to die: Nasa's watery graveyard in the South Pacific 1,600 miles from land is now the resting place for 260 sunken craft
By Harry Pettit For Mailonline
Published: 23 October 2017


NASA has a 'spacecraft cemetery' where it buries used satellites by crashing them into a remote region in the Pacific Ocean.

'Point Nemo' (Latin for 'no one'), also known as the Oceanic Pole of Inaccessibility, is more than 1,600 miles from any spot of land.

The graveyard has amassed the remains of at least 260 craft - mostly Russian - since it was first used in 1971, and helps to stop dangerous space junk from building up in orbit around Earth.

The spot's remoteness helps agencies avoid dangerous crashes.

Smaller satellites will burn up but pieces of the larger ones will survive to reach the Earth's surface. To avoid crashing on a populated area they are brought down near the point of oceanic inaccessibility.

The graveyard sits in the South Pacific between Australia, New Zealand and South America.

Due to oceanic currents, the region is not fished because few nutrients are brought to the area, meaning marine life is scarce.

Agencies time their craft for a controlled re-entry above the region to make sure the debris lands in the remote zone.

The spacecraft 'buried' there, which include a SpaceX rocket, several European Space Agency cargo ships, more than 140 Russian resupply craft, and the Soviet-era Mir space station, never reach the site in one piece.

SPACE JUNK

The graveyard helps to avoid the build-up of dangerous orbiting space junk above Earth.

But the growing amount of fast-moving space debris could lead to catastrophic collisions with satellites. There are an estimated 170 million pieces of so-called 'space junk' - left behind after missions, ranging from spent rocket stages down to paint flakes - in orbit alongside some £920 billion ($700 billion) of space infrastructure. Only 22,000 of these pieces are tracked, and with fragments able to travel at speeds above 27,000 km/h (16,777 mph), even tiny pieces could seriously damage or destroy satellites.
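The speed figure above makes it easy to see why even a paint flake is dangerous. A back-of-the-envelope calculation (the 1-gram mass is an illustrative assumption, not a figure from the article):

```python
# Kinetic energy of a tiny debris fragment at the quoted speed of 27,000 km/h.
speed_m_s = 27_000 / 3.6   # convert km/h to m/s -> 7,500 m/s
mass_kg = 0.001            # assume a 1-gram paint flake

kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(f"{kinetic_energy_j:.0f} J")  # ~28,000 J
```

For comparison, a typical pistol bullet carries on the order of 500 J of muzzle energy, so even this one-gram fragment packs dozens of times that, which is why untracked debris is treated as a serious hazard.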

One future visitor to this desolate place will be the International Space Station. Current plans are for it to be decommissioned in the next decade and it will have to be carefully brought down in the oceanic pole of inaccessibility.

Columbus and the potato that changed the world...

A salute and a toast are due to Columbus. The Columbian Exchange wrought both good and bad effects in its wake, but on the whole his discoveries knitted the world together and, in the long run, produced the world we live in today.

Columbus did not introduce slavery in the Americas. Long before Columbus, the Aztec empire (Mexica) made slaves of the neighboring tribes they conquered, and they sacrificed thousands of their conquered subjects on their temples with priests wearing the flayed skins of previous sacrificial
beings. In 1519, when Hernando Cortez conquered the Aztecs (Mexica) he did so with some 30,000 Aztec subjects who wanted to free themselves from Aztec dominance.
What Columbus, and the Conquistadors who followed, did to the Indians in the Americas was no different from what the Aztecs were doing to their fellow Indian subjects. There are those who give the Aztecs a pass because, so it is said, they were acting within their era and culture. Fair enough, but the same must be said for the Spanish Conquistadors; they, too, were acting within their era and culture.
Columbus was a first-rate mariner, jokes about him thinking he had discovered India or Japan (Cipangu) notwithstanding. While latitude could be calculated, longitude could not. Yet he navigated his way to the New World. We owe Columbus a great debt for his efforts.

This thoughtful article dodges one of the most important exchanges. Syphilis was probably a New World disease, and Columbus's "gift" to Europe. There were major and deadly epidemics following Columbus's various journeys. There may have been a similar disease in the Mediterranean world, but nothing like the variety that reduced Europe's population so radically. Native Americans, who had lived with syphilis for long periods, were somewhat immune to it, unlike Europeans.


Christopher Columbus and the potato that changed the world
Steve Hendrix, 8 Oct, 2018

It was a small round object sent around the planet, and it changed the course of human history.

Call it “Spudnik.” It was a potato.

On Columbus Day, the country commemorates the grand global changes — discoveries and destruction alike — that unfolded after Christopher Columbus linked the New World and the Old. But some scholars take a more granular view of what Columbus wrought. They look at the very seeds, seedlings and tubers that began crisscrossing the oceans in what they call the “Columbian Exchange.”

The potatoes, tomatoes, corn, peppers, cassava and other plants native to the Americas did more than enliven the cook pots of Europe, Africa and Asia.

They transformed cultures, reshuffled politics and spawned new economic systems that then, in a globalizing feedback loop, took root back in the New World, as well.

It was a grand shuffling of organisms with results both great and disastrous: Malaria-fighting quinine from the South American cinchona tree aided European colonization throughout the tropics; the ballast dumped in Virginia by ships picking up tobacco introduced earthworms to the Mid-Atlantic.

Diseases common in the Old World quickly devastated the indigenous populations in the New.

“What happened after Columbus,” writes science journalist Charles Mann in “1493,” his book on the topic, “was nothing less than the forming of a single new world from the collision of two old worlds — three, if one counts Africa as separate from Eurasia.”

The potato alone gets credit for population booms in parts of northern Europe that paved the way for urbanization and, in turn, fueled the Industrial Revolution. Tobacco had such value it was used as currency in some places. Some American foods became staples abroad, from the tomato in Italy and
cassava in Africa to the peppers that became the paprika of Hungary and the curries of India.

“There really was no spicy food in the world before the Columbian Exchange,” said Nancy Qian, an economics professor at Northwestern University who has studied how the back-and-forth flow of new foods, animals and germs reshaped the world.

Researchers don’t know what use indigenous Americans made of the capsicum peppers that originated in Bolivia and Brazil. But as they spread around the globe, the zesty pods that are the ancestor of modern bell, cayenne and jalapeño peppers allowed cooks to conceal the tastes of foods that were still edible but going a bit off. Soon peppers would form the base of dishes around the warmer latitudes, from Vietnamese pho to Mexican salsa.

By far the most consequential transfer of organisms, Qian said, was the introduction of unknown pathogens into the defenseless populations of the Americas. In the first century-and-a-half after Columbus, smallpox, measles, whooping cough, typhus and other infectious diseases killed up to 80
percent of native people, according to demographer Noble David Cook. And when Europeans introduced sugar, cotton and other plantations to the Americas, they enslaved more than 12 million Africans to work them.

[Slavery’s bitter roots: In 1619, ‘20 and odd Negroes’ arrived in Virginia]

On the other side of the Atlantic, fewer cataclysmic shifts occurred when new species arrived. None had more impact than the potato, Qian said.

Before Columbus landed on Hispaniola, the European diet was a bland affair. In many northern climes, crops were largely limited to turnips, wheat, buckwheat and barley. Even so, when potatoes began arriving from America, it took a while for locals to realize that the strange lumps were,
comparatively speaking, little nutritional grenades loaded with complex carbohydrates, amino acids and vitamins.

“When [Sir Walter] Raleigh brought potatoes to the Elizabethan court, they tried to smoke the leaves,” Qian said.

Eventually, starting with a group of monks on Spain’s Canary Islands in the 1600s, Europeans figured out how to cultivate potatoes, which form a nutritionally complete — albeit monotonous — diet when combined with milk to provide vitamins A and D. The effects were dramatic, boosting populations in Ireland, Scandinavia, Ukraine and other cold-weather regions by up to 30 percent, according to Qian’s research. The need to hunt declined and, as more land became productive, so did conflicts over land.

Frederick the Great ordered Prussian farmers to grow them, and the potato moved to the center of European cultures from Gibraltar to Kiev. "Let the sky rain potatoes,” Shakespeare wrote in "The Merry Wives of Windsor.” Their portability made them ideal to transport into the growing cities, feeding the swelling population that would be needed for a factory labor force.

“It’s hard to imagine a food having a greater impact than the potato,” Qian said.

Cassava, which remains the foundation of many African diets, had a similar nutritional impact as it spread from the Americas. Sweet potatoes, too, proved hardy in flood-prone fields. In China, some scholars credit the sweet potato with reducing the frequent uprisings against emperors, whom peasants tended to blame when floods destroyed their rice crops.

Some of the most notable additions to global cuisine are nutritionally neutral: chocolate (made from cacao beans); vanilla (which was first processed to improve the flavor of chocolate); and the tomato, a native of the Andes that had been transported to Mexico. There, according to Mann, “native plant
breeders radically transformed the fruits, making them bigger, redder, and, most important, more edible." The result would transform the cuisine of Italy and bestow upon the world pizza, ketchup and the Bloody Mary.

“We don’t need them to survive,” Qian said. “But I don’t want to imagine a world without tomatoes and chocolate.”

How Bosses Waste Their Employees’ Time

Managers are often oblivious to the impact of their words and actions. Here’s how they can open their eyes.

How Bosses Waste Their Employees’ Time
By Robert I. Sutton, Aug. 12, 2018

Leaders don’t mean to waste their employees’ time. Unfortunately, many of them heap unnecessary work on the people below them in the pecking order — and are downright clueless that they’re doing it.

They give orders without realizing how much work those directives entail. They make offhand comments and don’t consider that their employees may interpret them as commands. And they solicit opinions without realizing that people will bend over backward to tell them what they want to hear—rather than the whole truth, warts and all.

That is what my Stanford colleague Huggy Rao and I have learned from our “organizational friction” project. We’re studying why some organizations make the right things too difficult to do and the wrong things too easy to do — and what leaders can do to avoid such missteps.


The roots of waste

Before describing how to avoid it, it’s important to understand why so many leaders are blind to the ways they waste employees’ time.

First, many bosses don’t pay enough attention to followers’ behaviors, needs and troubles.

The CEO of one firm I studied, for instance, fell in love with new management concepts, such as “lean” operations, and frequently announced new company-wide initiatives — often once a quarter. But those announcements typically didn’t take into account initiatives from previous quarters. Employees were often asked to drop what they were doing before and start a new mission from scratch.

Each new initiative entailed a new round of training, meetings and paperwork. Even though many employees learned the fine art of “fad surfing”—that is, complying with the changing directives as little as possible and focusing on their core work—they still wasted a lot of time.

At many companies, meanwhile, employees become aware of how self-absorbed their bosses are,
and so focus on telling the bosses what they think the bosses want to hear, and on doing things they believe will keep their bosses happy. This leads to what Dr. Rao and I call “executive magnification,” when people bent on buttering up a leader react far more strongly to his or her words or actions than
the leader ever intended.

For example, an executive told me a story, perhaps apocryphal, about a CEO who commented that there were no blueberry muffins at a breakfast meeting. He wasn’t especially fond of them; it was just small talk. After that, his staff sent strict instructions about this preference to every host. It took him years to discover why there were piles of blueberry muffins every place he went.

In another case, an executive asked his workers why there was a new door in one room. His people took it as a criticism, so they plastered and painted it over to please him. When he explained that he had not meant it as a complaint, they put the door back in.


Not all smiles


Executive magnification can generate far more troubling waste. In the 1980s, a co-researcher and I studied a retail chain that spent millions on improving employee courtesy. They used training, incentives and contests to encourage clerks to offer smiles, eye contact, greetings and thanks to customers.

This campaign was launched, in part, because the CEO complained about a rude clerk he encountered. It took a couple of years before he realized his brief rant had triggered a big campaign that he never wanted—and he ordered the company to wind it down.

Another way that executives waste employees’ time, slow the work and add to their own burdens is by “cookie licking,” a term inspired by sneaky children who lick cookies to deter others from eating them.

Cookie licking happens when leaders of growing companies don’t realize or accept that the time has come to delegate responsibilities. For instance, it made sense for one CEO we know to interview every job candidate when her company had 25 employees—but not when it grew to over 500.

Yet she insisted on doing so even though scheduling interviews placed enormous burdens on her
assistant and human-resources staffers, as her schedule became more packed. The company also lost several top prospects, who accepted jobs elsewhere before interviews with the CEO could be scheduled.

A year too late, the CEO decided she was too busy to interview every candidate. But she remained
oblivious to how her actions had burdened colleagues and driven away candidates.


Listening to criticism

How can leaders stop making these mistakes? How can they recognize that they have created an atmosphere where wasting time is more the norm than the aberration?

They can start by being skeptical when they hear nothing but sunny feedback from followers. They should also be vigilant about their minor complaints and offhand remarks. When they say anything that could be misconstrued as a command or desire for change, it helps to add, “Please don’t do anything, I am just thinking out loud.”

And when leaders encourage candor and criticism from employees, they should make sure it isn’t just lip service, and back it up with actions.


A film director we interviewed described how he was dissecting the flaws in a scene for his team. Then one member sighed.

The director called him on it, and the man mumbled that he didn’t have anything to add. But when the director nudged him to speak, he made a suggestion about changing the scene that the director praised and implemented.


A radical change

As part of embracing complaints, leaders might consider a radical (and often uncomfortable) change in how they define star employees.

Research on psychological safety led by Amy Edmondson at the Harvard Business School shows that the best employees for promoting organizational learning are often those who never leave well enough alone, pointing out mistakes and flawed practices. But those whom management rates as top performers are often those who silently do what they’re told and what has always been done — and don’t annoy their superiors with complaints and questions about flawed practices.

My work with Dr. Rao reveals similar problems: Employees who start big programs are often celebrated, but rarely those who end old, obsolete and ineffective programs and practices. And managers who lord over big teams and keep adding underlings are rewarded with prestigious titles and big raises—even when their ever-expanding army of bureaucrats adds unnecessary rules and procedures that sap time and energy from people who do the most important work.

Instead, the best leaders discourage this addition sickness by praising, promoting and paying employees who remove destructive friction and waste.

As with most positive steps, playing the subtraction game is much like mowing a lawn. Leaders can’t just do it once and declare victory. They have to do it on a regular basis, or else the old bad habits will creep back into place.


***
Dr. Sutton is a professor in the department of management science and engineering at Stanford University and co-author of “Scaling Up Excellence.” He can be reached at reports@wsj.com

Why Do We Call Computer Glitches "Bugs"?

Why Do We Call Computer Glitches "Bugs"?
by Mae Rice, September 7, 2018

When something goes wrong with a computer, why do we call it a "bug"? Sure, bugs can be unpleasant, but so are rashes, all-hands meetings, and overly-scented candles. Popular lore has it that society as we know it settled on "bug" because renowned computer scientist Grace Hopper once found an actual moth in a computer. But while that did happen, the legend behind the term isn't quite right.

Grace Hopper was part of the tiny team that created the world's first programmable computer: Harvard's Mark I. That wasn't her only first: She was also the first woman to receive a Ph.D. in mathematics from Yale, she helped create the first compiler for computer languages, and she was the first woman to receive the National Medal of Technology. It's no wonder that people called her "Amazing Grace." Today, she's commemorated with the annual Grace Hopper Celebration of Women in Computing.

The best-known origin story for the computer term "bug" goes like this: Back in 1943, Hopper was working for the U.S. Navy while the country was in the thick of World War II. The stakes were high, and there was a glitch in Mark I that was hard to track down, given that Mark I was the size of a room. Eventually, though, Hopper found the problem: a moth stuck in the inner workings. She smooshed the moth's corpse in her notebook and wrote next to it in an entry dated September 9, "First actual case of bug being found." This, according to the Navy's website, was the introduction of the term "bug."

But is it?

Factually, the story mostly holds up. She wrote the note, though she may or may not have discovered the moth, and it might have been the Mark II instead of the Mark I. The exact year in the 1940s is also up for discussion. However, the real controversy is that she wasn't coining the term "bug" so much as punning on it — it was already in use.

Where Did "Bug" Really Come From?

Well ... not from Hopper. Coining a term does usually require a little more explanation than Hopper included in her notebook, and Hopper's papers show that she and others had used the term for computer problems for several years previous to the moth incident. In fact, it even predates Hopper herself. According to the Oxford English Dictionary, it first appeared in 1889, in a newspaper description of Thomas Edison. (You know, the guy credited with the invention of the light bulb. And the telegraph. And the spirit phone, kind of.)

Back in 1889, a reporter for the Pall Mall Gazette wrote: "Mr. Edison ... had been up the two previous nights working on fixing 'a bug' in his phonograph — an expression for solving a difficulty, and implying that some imaginary insect has secreted itself inside and is causing all the trouble."

However, the term "bug" appears in Edison's private journals and letters as far back as 1876, long before this article went to print. It seems that in addition to inventing assorted technologies in his lab, Edison also invented the term "bug" and passed it on to this reporter. A prolific inventor indeed!

So how did he come up with the term? Computerworld notes that it's sometimes traced back to an ancient word for monster, still visible in rarely-used words like "bugaboo" (and, perhaps ... Destiny's Child's "Bug a Boo"?). However, Edison's coinage seems less about ancient history than about literal bugs. He imagined little scapegoat bugs trapped in his glitchy machines. In an 1878 letter, he also notes that technological bugs "show themselves and months of intense watching, study and labor are requisite before commercial success or failure is certainly reached" — sort of like real bug infestations. You never notice roaches during the apartment viewing, after all; it's only once you move in that they reveal themselves.

So how did Hopper's moth end up taking so much credit? That may come down to Hopper herself. In the years that passed, she told and retold the tale of the moth in the machine, adding at one point, "From then on, when anything went wrong with a computer, we said it had bugs in it."

"Let me put it this way," Smithsonian's Peggy Aldrich Kidwell told the New York Times. "Dr. Hopper told a good story."

Buran - Soviet space shuttle story...

When the first flight was also the last!
A wasteful adventure to copy the American behemoth space shuttle.

The Buran didn't have an escape system; instead it was fully automated, so it could fly without a crew. Only military missions were supposed to be manned. Buran was built as a counter to the US Shuttle, which the Soviets believed to be a purely military program, so it was anticipated that the spaceship would need to carry a crew for orbital combat.

Soviet space shuttle: Learn how Buran made debut flight 30 years ago
RT : 15 Nov, 2018
Circling around the Earth twice, the first Soviet orbital spaceplane ‘Buran’ landed safely 30 years ago, all while having no crew inside. Its launch was the pinnacle of the nation’s space program but also ended up being its last.

By the mid-1970s, the Soviets were looking to create reusable orbiters for spaceflights, and that is how the idea of ‘Buran’ was born.

‘Buran’, which means ‘snowstorm’ or ‘blizzard’ in Russian, was seen as the direct rival to NASA’s Space Shuttle program.

The ship’s purpose was similar – transporting cosmonauts and cargo, while taxiing between Earth and the International Space Station. It was also intended to deliver satellites and smaller spacecraft into orbit. Buran itself was due to be launched into space via ‘Energia’ – the most powerful rocket the Soviet Union ever built.

The Energia-Buran project was considered one of the most ambitious space-themed projects ever. Its production and design efforts united around 1,300 different organizations across the whole country, involving more than 2.5 million people.

Just getting the spaceplane to a launch pad required unorthodox solutions. The shuttle was carried atop the gigantic VM-T and An-225 cargo planes. And a brand new 4.5-kilometer landing strip was built at the Baikonur Cosmodrome to accommodate the shuttle on its way back from space.

The Soviets initially planned to launch Buran in 1984, but it took four more years to complete the first ship. The amount of work involved was massive. “The orbiter weighed 100 tons. And the paper we had to waste on it weighed even more,” former deputy head of construction at the Energia-Buran project, Vyacheslav Filin, told Gazeta.ru. “Every screw had its own blueprint.”

The launch date was moved several times, and finally, on November 15, 1988, Buran saw its first flight.

Buran’s success caused a sensation around the world, and the shuttle became one of the primary attractions at the prestigious aviation expo in Paris in 1989. However, the Soviet Union itself had only two years left to exist.

Its mission was unique because Buran had no crew inside. It circled twice around the Earth and landed safely, all while being controlled from the ground. The whole flight lasted almost four hours.

As the communist country broke apart and the economic crisis swept through Russia, Moscow had no resources to maintain such a grandiose and expensive project. All planned Buran flights were cancelled, and the program itself was officially suspended in 1993. But the unique success of Buran’s maiden flight remains a majestic legacy to this day.

Thursday, November 8, 2018

Who Invented the Scientific Method?

You've Probably Never Heard of Him.

Who Invented the Scientific Method?
Written by Reuben Westmaas
January 9, 2018


History is full of forgotten heroes. Sometimes that's because somebody else got credit for their work. Sometimes it's because they were women in a male-dominated world. And sometimes it's because a couple of continents *cough* Western society *cough* decided they didn't want to include them in the history books. Meet Ibn al-Haytham — the guy who basically invented Science with a capital S.

A Little Method, A Little Madness

Ibn al-Haytham was born in what is now Basra, Iraq, sometime around 965 C.E., and around the dawn of the 11th century, he moved to Cairo, where he'd do his most influential work.

Though the details are a bit fuzzy, we know two things about his time in Egypt. First, his intellectual prowess was immediately apparent to everyone around him. And second, he had a habit of biting off more than he could chew and angering the wrong people.

His first major project was a dam that would have regulated the flooding of the Nile, but it didn't take long for him to realize that it was impractical. He then got a plum administrative job that didn't go especially well either — and the Caliph wasn't pleased. According to a 13th-century account of how things went down, al-Haytham feigned madness to protect himself from the wrath of the ruler, who settled on placing him under house arrest instead of ordering a more ... dramatic retirement.

As it turned out, a comfortable prison full of scientific texts and tools was exactly what al-Haytham needed. Over the next decade, he proved that light travels in a straight line, he demonstrated how mirrors work, and he made the compelling (and correct) argument that light bends when it travels through water.

But probably his most impressive contribution to science was, well, science. Or at least, the scientific method. See, he didn't want to just tell the world what he'd found. He wanted to show the world how he found it — and he wanted the world to try out his experiments for themselves. So he meticulously documented his experiments in his 40+ academic works, which ranged in subject from the behavior of light to the motion of the planets to architecture and engineering.

The Giant Newton Stood On

These days, Ibn al-Haytham isn't exactly a household name in the West. But go back a couple hundred years, and Europeans were a little more willing to listen. His most influential work, the Book of Optics, was translated into Latin about 100 years after his death and proved to be a top-seller among Europe's favorite thinkers. Roger Bacon, Johannes Kepler, and even Leonardo da Vinci read his thoughts, although they knew him by the Latinized name "Alhazen."

In fact, the "Selenographia," a 17th-century treatise on the nature of the Moon, put al-Haytham on the frontispiece alongside crowd favorite Galileo Galilei. There's no doubt that European science was transformed by al-Haytham's meticulous approach — and it's about time he starts getting his due.

Is Yogurt Actually Good For You?

An R.D. explains everything you need to know about your favorite snack.

Is Yogurt Actually Good For You?
By Colleen de Bellefonds   
Nov 2, 2018

Yogurt is probably one of your favorite go-to breakfasts or snacks. It’s a good source of protein—important for strong muscles and bones—and packs a ton of gut-healthy probiotics. But some yogurts tend to contain a lot of sugar. So is it really that healthy after all?

The short answer is yes. “Considering it’s packed with probiotics, calcium, potassium, and protein, yogurt is one of the healthiest foods you can eat,” says Karen Ansel, M.S., R.D.N.

Below, she details everything you need to know about the health benefits of yogurt.
Greek yogurt vs. regular yogurt: Which one is better?

In some ways, Greek yogurt is actually pretty similar to regular yogurt, nutritionally speaking. (You can see how eight ounces of plain, low-fat yogurt and plain, low-fat Greek yogurt compare in the infographic.)

But Ansel says Greek yogurt does have a lot more protein than regular yogurt (23 grams versus 12 grams). It’s also generally lower in carbs and sugar.

And while 16 grams of sugar per serving of regular yogurt looks pretty high, keep in mind that all yogurt naturally has some sugar. Ansel says that natural sugar is balanced out by all the protein, calcium, and potassium that’s packed in there, too.

However, flavored yogurts (whether they’re regular or Greek) often contain added sugars and sweeteners that will take the sugar counts way up. Skip those and add your own in the form of fruit, cinnamon, honey, or maple syrup if needed.

The downside of Greek yogurt is that processing (specifically straining, which gives Greek yogurt its unique, thick texture) removes roughly half of the calcium from Greek yogurt, per the Harvard School of Public Health. Many brands add a calcium supplement back in, but check the label to be sure.

Otherwise both types have all of the other same health benefits, so Ansel suggests choosing whichever you enjoy eating most.


What are the actual health benefits of yogurt?

To be clear, you can get way more from a cup of yogurt than just calcium and protein. It also contains “good bacteria” that support your gut and immune system. “[Probiotics] have been credited with everything from improving digestion, to boosting immune health, to protecting against depression,” says Ansel.

It gets better: A 2012 study of over 120,000 people who weren’t obese and didn’t suffer from chronic disease found that regularly eating yogurt might protect against weight gain, possibly due to changes in gut bacteria.

Plus, some studies have suggested that four weeks of regularly eating probiotic yogurt is good for your brain, while another large study credited the healthy bacteria in yogurt with lowering the risk of heart attack and stroke among people who ate just two servings a week. Not bad, not bad at all.
What’s better: low-fat, non-fat, or full-fat yogurt?

Research has shown that full-fat dairy isn’t actually bad for you, and that there’s no significant increase in your risk of heart disease and stroke. Plus, the full-fat stuff will keep you fuller longer, so you’ll be less likely to overeat during the course of your day. Non-fat options, meanwhile, often come with lots of extra sugar to mimic flavor. So stick with plain, full-fat yogurt and top with your favorite fruits for flavor.

Building technique of Egyptian pyramids

Speculative although sounds very credible...

Is this how Ancient Egyptians built the pyramids?

A RAMP has been discovered in a 4,500-year-old quarry, and a graphic reveals the complex system of pulleys used to drag huge stone blocks hundreds of feet.
  •     Archaeologists found ancient ramp system at site in Egypt's Eastern Desert
  •     Slope is lined with two staircases and wooden poles where ropes would be tied
  •     Researchers say this would have lightened the load for workers dragging huge blocks
By Harry Pettit For Mailonline and Cheyenne Macdonald
Daily Mail (UK): 7 November 2018

A new graphic reveals the complex system of ramps and pulleys that may have been used by the Egyptians to construct the ancient pyramids.

It follows the recent discovery of Ancient Egyptian stone working ramps dating back 4,500 years in an alabaster quarry in the country's eastern desert.

The system raised stone blocks weighing several tonnes hundreds of feet into the air via enormous sleds, archaeologists believe.

This same technology may have allowed the Egyptians to haul blocks up steep inclines to build the Great Pyramid - the only surviving Wonder of the World.

The ancient ramp was discovered in Hatnub quarry by researchers from the French Institute for Oriental Archaeology in Cairo and the University of Liverpool.

It was flanked by two staircases lined with post holes, to which ropes were tied to drag the huge stone blocks.

Workers walked up the staircases on either side of the block, pulling the rope as they went, a system that alleviated some of the burden for the huge load.

The large wooden posts, which measured up to one-and-a-half feet (0.5 metres) thick, were key to the system, researchers said.

They allowed teams of workers to pull from below while others hauled the block from above.

It meant the ramp was inclined at double the angle that would have been considered possible, given the weight of the stones that workers were lifting.

'The arrangement allows people to be spaced up and down the ramp, and all the force to be exerted in the same direction,' Dr Roland Enmarch, of the University of Liverpool, told the Times.

The finding is significant because the stones lifted from the quarry would have been roughly the same size as those used in building the Great Pyramid.

Ramps used in construction were removed following completion, meaning the techniques used by the Egyptians to build the huge pyramid remain a mystery.

Researchers said the discovery is the first of its kind, and clearly indicates that the technique dates 'at least to Khufu's reign' – for whom the 481-foot Great Pyramid was built.

They argued it was 'plausible' to infer that the system discovered at Hatnub was the same as that used to build the Great Pyramid.

'This shows that at the time the Great Pyramid was being built, this technology was also being used,' Dr Enmarch said.

Archaeologists had long assumed that the Egyptians used ramps to build the Great Pyramid, but how that ramp system worked has remained a mystery for centuries.

It was long thought the ramps would need an incline of ten per cent at most to allow workers to drag the blocks so high.

This would have meant the ramps stretched long into the desert.

But the new find shows the ramps had an incline of up to 20 per cent, which they managed via the complex pulley system.

'This system is composed of a central ramp flanked by two staircases with numerous post holes,' Dr Yannis Gourdon, co-director of the joint mission at Hatnub, told Live Science.

'Using a sled which carried a stone block and was attached with ropes to these wooden posts, ancient Egyptians were able to pull up the alabaster blocks out of the quarry on very steep slopes of 20 percent or more.'
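As a rough illustration of why the steeper grade matters (a hypothetical back-of-the-envelope sketch, not from the study): a slope "per cent" is rise over horizontal run, so a 20 per cent grade corresponds to only about an 11-degree angle, yet it halves the horizontal length of ramp needed to reach a given height.

```python
import math

# Hypothetical illustration (not from the article): converting the quoted
# ramp grades to angles, and comparing the ramp footprint each requires.
def grade_to_degrees(grade_percent: float) -> float:
    """Convert a slope grade (rise/run, in percent) to an angle in degrees."""
    return math.degrees(math.atan(grade_percent / 100.0))

# For a ramp of fixed height, the horizontal run needed is height / (rise/run),
# so doubling the grade roughly halves the ramp's footprint.
def ramp_run_length(height_m: float, grade_percent: float) -> float:
    """Horizontal run (metres) needed to reach height_m at the given grade."""
    return height_m / (grade_percent / 100.0)

old_assumption = grade_to_degrees(10)  # ~5.7 degrees
new_finding = grade_to_degrees(20)     # ~11.3 degrees

# Example: reaching an assumed 50 m of height (figure chosen for illustration)
run_at_10 = ramp_run_length(50, 10)  # 500 m of horizontal run
run_at_20 = ramp_run_length(50, 20)  # 250 m of horizontal run
```

This is why a workable 20 per cent grade matters for the long-ramp problem the article describes: the shallow ramps previously assumed would have had to "stretch long into the desert", while a ramp twice as steep needs only half the ground.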

Wednesday, November 7, 2018

The Unraveling of America

Republican or Democrat... it doesn't matter. In the end, the people who suffer most are middle- and working-class, tax-paying, legal US citizens and residents. The two-party system is designed to keep us squabbling over petty differences while politicians and
their corporate sponsors rob the American taxpayer, and America itself, blind.

We live in an age of rigged polls and corporate journalists taking assignments from paying clients on their political reporting. And then of course we have social media, one of the biggest election “interferers” of all.

Election 2018 and The Unraveling of America. The Great Distraction. Rising Social Inequality



If they’re Millennials, they may be weighing whether to vote at all, since neither wing of the corporate Party of America—aka Republicans or Democrats—has done much for them over the past ten years. Burdened mostly with low-paying service jobs and more than $1 trillion in student debt, with payments that consume roughly 37% of their paychecks, with real incomes well below what their parents were earning at their age, and with prospects for the future even bleaker, many Millennials no doubt wonder what’s in it for them in voting for either party’s candidates.
Will Millennial youth even bother to turn out to vote? As an editorial in the Financial Times business newspaper recently noted, “Only 28% of Americans aged 18 to 29 say they are certain to vote this November”. Political cynicism has become the dominant characteristic of much of their generation—deepening since the politicians’ promises made in 2008 have failed to materialize under Obama and now Trump.
If they’re Latinos and Hispanics, as they go to the polls they are aware their choice is either Trump Republicans who consider them enemies, criminals and drug pushers; or Democrats who, in the past under Obama, deported their relatives in record numbers and repeatedly abandon programs like DACA (‘Dreamers’) as a tactical political necessity, as they say. Who will they trust least? One shouldn’t be surprised if they too largely sit it out, harboring a deep sense of betrayal by Democrats and concern they may soon become the next ‘enemy within’ target of Trump and his White Nationalist shock troops who are being organized and mobilized behind the scenes by Trump’s radical right wing buddy, Steve Bannon, and his billionaire and media friends.
If they’re African Americans, they know from decades of experience that nothing changes with police harassment and murders, regardless of which party is in power.
If they’re union workers in the Midwest, they know the Democrats are the party of free trade and job offshoring, while Republicans are the party favoring low minimum wages, elimination of overtime pay, privatization of pensions, and cuts to social security.
All these key swing groups of Millennials, Hispanics, African-Americans, and union workers in the midwest—i.e. those who gave Obama an overwhelming victory in 2008, gave him one more chance in office in 2012 despite failure to deliver, and then gave up on the unfulfilled promises in 2016—will likely not be thinking about the real ‘issues’ as they go to the polls. For the ‘Great Distraction’ is underway like never before.

The Great Distraction
It’s the ‘enemy within’ that’s the problem, we’re told by Trump. And the ‘enemy without’. Or, in the case of the immigrant—it’s both: the enemy without that’s coming in! So put up the barbed wire. Grab their kids when they arrive, as hostage bait. Send the troops to the border right now, to stop the hordes that just crossed into southern Mexico yesterday. Hurry, they’re almost here, rapidly proceeding to the US on foot. (They run fast, you see). They’re in Oaxaca southern Mexico. They’ll be here tomorrow, led by Muslim terrorists, carrying the bubonic plague, and bringing their knapsacks full of cocaine and heroin.
And if the immigrant is not enemy enough, the ‘enemy within’ is increasingly also us, as Trump adds to his enemies list the ‘mob’ of Americans exercising their 1st amendment rights to assembly and protest against him. And don’t forget all those dangerous Californians who won’t go along with his climate, border incarceration, trade or other policies. Or their Senator Dianne Feinstein, their ring-leader in insurrection. They’re all the ‘enemy within’ too. The chant ‘lock ‘em up’ no longer means just Hillary. So Trump encourages and turns loose his White Nationalist supporters to confront the horde, the mob, and their liberal financiers like George Soros. If all this is not an unraveling, what is?
Not to be outdone in the competition for the Great Distraction, there’s the Democrats resurrecting their age-old standby ‘enemy without’: the Russians. They’re into our voting machines. Watch out. They’re advancing on Eastern Europe, all the way to the Russian-Latvian border. Quick, send NATO to the Baltics! Arrange a coup partnering with fascists in Ukraine! Install nuclear missiles in Poland! And start deploying barbed wire on the coast of Maine and Massachusetts, just in case.
However, behind all the manufactured fear of immigrants, US demonstrators, and concern about violence- oriented white nationalists whipped up and encouraged by Trump and his political followers—lies a deeper anxiety permeating the American social consciousness today. Much deeper. Whether on the right or left, the unwritten, the unsaid, is a sense that American society is somehow unraveling. And it’s a sense and feeling shared by the left, right, and center alike.
Both sides—Trump, Republicans, Democrats, as well as their respective media machines—sidestep and ignore the deep malaise shared by Americans today. Older Americans shake their heads and mumble ‘this isn’t the country I grew up in’ while the younger ask themselves ‘is this the country I’ll have to raise my kids in’?
There’s a sense that something has gone terribly wrong, and every appearance that it will continue to do so. It’s a crisis, if by crisis we mean ‘a turning point’. And a crisis of multiple dimensions. A crisis that has been brewing and growing now for at least a quarter century, since 1994 and Newt Gingrich’s launching of the new right wing offensive that set out purposely to make US political institutions gridlocked and unworkable until his movement could take over—and succeeded. It’s a crisis that everyone feels in their bones, if not in their heads. The dimensions of the unraveling of America today are many. Here are just some of the more important:

Growing Sense of Personal Physical Danger
Mass and multiple killings and murders are rampant in America today, and rising. So much so that the media and press consciously avoid reporting much of it unless it involves at minimum dozens or scores of dead. There are now more than 33,000 gun deaths a year in the US; 90 people a day are killed by guns. While we hear of the occasional school shooting, there have in fact been 273 school shootings in 2018 alone. That’s one per school day.
The suicide rate in America is also at record levels, with more than 45,000 deaths a year now and escalating. Teenage suicides have risen by 70% in just the last decade. The fastest rate of increase is among 35-64 year olds. People are being driven to despair by the culture, the insecurities, the isolation, the lack of meaningful work, the absence of community, and the hopelessness of a bleak future, to the point that they’re killing themselves in record numbers.

And let’s not forget the current opioid crisis. The opioid death rate now exceeds 50,000 a year. These aren’t folks over-dosing in back alleys and crack houses. These are our relatives, neighbors and friends. And the ‘pushers’ are the big pharmaceutical companies and their salespeople, who pushed Fentanyl and OxyContin on doctors while telling them it was safe—just as the tobacco companies maintained for decades that cigarettes were ‘safe’ when their own tests had long shown their product caused cancer. Big Pharma knew too. They are the criminals, and their politicians are the paid-for crooked cops looking the other way. None of that is surprising, however, since Big Pharma is also the biggest lobbying and campaign-contributing industry in the US.
So it’s 33,000 gun killings, 45,000 suicides, and 50,000 opioid deaths a year. Every year. That compares with the roughly 56,000 US deaths during the entire eight years of the Vietnam War! Over three years, that’s a death toll roughly equal to all the Americans who died during the three and a half years of World War II! We all got rightly upset over the nearly 3,000 killed on 9/11 by terrorists. But the NRA and the pharmaceutical companies are the real terrorists here, and politicians are giving them a complete pass.
Instead of Big Pharma CEOs and leaders of the National Rifle Association (NRA), we’re told the real enemies are the desperate men, women and children willing to walk more than a thousand miles just to get a job or to escape gang violence. Or we’re told it’s the Russians meddling in the 2016 election and threatening our democracy—when the real threat to American democracy is home grown: In recent court-sanctioned gerrymandering; in mass voter suppression underway in Georgia, North Dakota, and elsewhere; in the billions of dollars being spent by billionaires, corporations, and their political action committees this election cycle to ensure their pro-business, pro-wealthy candidates win.
News of these real killing machines goes on every day, creating a sense of personal insecurity that Americans have not felt or sensed perhaps since the frontier settlement period in the 19th century. It’s not the immigrants or the Russians who are responsible for the guns, suicides, and drug overdoses. But they certainly provide a useful distraction from those who are. People feel the danger has penetrated their communities, their neighborhoods, their homes. But politicians have simply and cleverly substituted the real enemies with the immigrant, the mob, and that old standby, the Russians.

Income & Wealth Inequality Accelerating
Another dimension of the sense of unraveling is the economic insecurity that has hung like a ‘death smog’ over public consciousness since the 2008-09 crash. As more and more average American households take on more debt, work more part-time jobs or hours, and adjust to a declining standard of living, they are simultaneously aware that the wealthiest 1% or 10% are enjoying income and wealth gains not seen since the ‘gilded age’ of the late 19th century. The share of national pre-tax income garnered by the top 10% has risen from 35% in 1980 to roughly 50% today. That’s 15 percentage points more to the top, equivalent to roughly $3 trillion a year in income gains for the top 10% that used to be distributed among the bottom 90%.
How could an America that once shared income gains from economic growth among its classes and across geography from World War II through the 1970s have now allowed this to happen, many ask? And why is it being allowed to get worse?
There are many ways to measure and show this economic unraveling. Whether national income shares for workers and wages falling from 64% to 56% of total national income; or the distribution to the rich of more than $1 trillion a year, every year since 2009, in stock buybacks and dividend payments; or the $15 trillion in tax cuts for investors, businesses, and corporations since 2001; or Trump’s recent $4 trillion tax windfall for the same; or stock market values tripling and quadrupling since 2009; or stagnant real wage gains for the middle class and declining real wages for those below the median.
Whatever dimension or study or statistic, the story is the same. Economic gaps are widening everywhere. And everyone knows it. And except for that noble, modern Don Quixote of American Politics, Bernie Sanders, it appears no one in either party is proposing to reverse it. So the awareness festers below the surface, adding to the realization that something is no longer right in America.

The sense of economic unraveling may have slowed somewhat after 2010, but it continues nonetheless, as millions of Americans are forced to take low-paying service jobs. Working two or more jobs to make ends meet. Taking Uber and gig work on the side. Going on Medicaid, or foregoing health insurance coverage altogether. Moving to lower-quality housing and taking on more roommates. Treading economic water in good times, and sinking and gasping for air during recessions and the bad times. Just making do. While the wealthy grow unimaginably wealthier by the day.

Never-Ending Wars
The sense of anxiety is exacerbated by the never-ending wars of the 21st century. How is it they never end, many ask, given the world’s most powerful military and funding of more than $1 trillion a year, every year?
Newspaper headlines haven’t changed much for 17 years. The war in Afghanistan and elsewhere continues. Change the dates and you can insert the same news copy. With more than 1000 US bases in more than 100 countries, America since 2001 has been, and remains, on a perpetual war footing. All that’s changed since 2000 is that the USA no longer pays for its wars by raising taxes, as it had throughout its history. Today the US Treasury and Federal Reserve simply ‘borrow’ the money from partners in empire elsewhere in the world—while they cut taxes on the rich at the same time.
And the annual war bill is going up, fast. Trump has increased annual spending on ‘defense’ by another $85 billion a year for the past two years—approaching $150 billion if the notorious US ‘black budget’ spending on new military technology development, not indicated anywhere in print, is added to the amount. And more is still coming in the next few years, to pay for new cybersecurity war preparation, for next-generation nuclear weapons, and for Trump’s ‘space force’. Total costs for defense and war—not just the Pentagon—are now well over $1 trillion annually in the US. And with tax cutting for those who might pay for it now accelerating, the only ways to cover the trillion-dollar-plus annual US budget deficits coming over the next decade are to borrow more or to cut Social Security, Medicare, education and other social programs. And those cuts are coming too—soon, if one believes the public declarations of Senate Republican Majority Leader Mitch McConnell.

Technology Angst
As our streets and neighborhoods become more dangerous, as inequality deepens, as wars, tax cuts for the rich, and social program cuts for the rest become the disturbing chronic norm—awareness is growing that technology itself is beginning to tear apart the social fabric as well. As even visionaries and advocates of technology admit, its negatives may now outweigh its benefits.
Studies now show problems of brain development in children over-using hand-held screen devices. Excessive screen viewing, studies show, activates the same areas of the brain associated with other forms of addiction. Social media is encouraging abusive behavior by enabling offenders to hide. What someone would not dare to say or do face to face, they now freely do, protected by space and time. Social media is transforming human communications and relations rapidly, and not always positively. It is also enabling the acceleration of the surveillance state. Massive databases of personal information are now accessible to any business, to virtually any government, and to unscrupulous individuals around the globe intent on blackmail, threats, and worse. Privacy is increasingly a fiction for those participating in it.
And employment is about to become more precarious because of it. Technology is creating and diffusing new business models, destroying the old, and doing so far too rapidly to enable adjustment for tens of millions of people. Amazon. Uber. Gig economy. Wiping out millions of jobs, increasing hours worked, uncertainty of employment, lowering of wages. And next Artificial Intelligence. Projected by McKinsey and other business consultants to eliminate 30% of current jobs by the end of the next decade. Where will my job be in ten years, many now ask themselves? Will I be able to make it to retirement? Will there be anything like retirement any more after 2035?
Unchecked and unregulated accelerating technological change is adding to the sense of social unraveling of key institutions that once provided a sense of personal security, of social stability, of a vision of a future that seemed more related to the present, rather than to an even more anxiety ridden, uncertain, unstable future.

A Culture Increasingly Coarse & Decadent
When the President of the US brags that he could shoot someone on a street corner and (his) people would still love him, such statements raise the ghostly spectre of prior decades, when the vast majority of the German people thought the same of Hitler. And when one of his closest advisers, Rudy Giuliani, declares publicly that ‘truth is not the truth’, it amounts to an endorsement of an era of lies and gross misrepresentation by public figures. With chronic lying the political norm, what can anyone believe from their elected officials, many now ask? It’s no longer a matter of political spin for one’s particular policy or program; it’s politics itself spinning out of control. Public political discourse consists increasingly of targeting, insulting, vilifying, and threatening one’s political opponents. Trump’s railing against politicians and government itself smacks of Hitler’s constant insulting indictment of democratically elected Weimar German governments and leaders in the 1920s. It leaves the American public with a nervous sense of how much further this targeting, personalizing, and threatening can and will go.
But the political culture is not the only cultural element in decline. A broader cultural decline has become evident as well. Americans flock to films of dystopian visions of America, of zombies, and of ever more intense CGI violence in which fictitious superheroes save the world. Much of popular music has become overtly misogynistic, angry, mean, and violent in both sound and lyrics. And has anyone recently watched how high schoolers now dance, in effect having sex with their pants on?

Collapse of Democratic Institutions
Not least is the sense of unraveling of political institutions and the practice of democracy itself. A recent study estimated that democracy is in decline in the US, its aggregate score having dropped from 94 in 2010 to a low of 86 today—measured in terms of free and fair elections, citizen participation in politics, protection of civil rights and liberties, and the rule of law. The study, by the non-profit Freedom House, concluded that ‘democracy is in crisis’, under assault and in retreat.
In America, restrictions on civil rights and liberties have been growing and deepening since 2001 and the Patriot Acts, institutionalized in annual NDAA legislation by Congress thereafter. Legislatures have been gerrymandered to protect the incumbents of both wings of the Corporate Party of America. The US Supreme Court has expanded its authority to select presidents (Bush v. Gore in 2000), defined corporations as people with the right to spend unlimited money, which it defines as free speech (Citizens United), and will likely next decide that a President (Trump) can pardon himself if indicted (thus ending the fiction that no one is above the law, and endorsing tyranny itself).
The two wings of the Corporate Party of America meanwhile engage in what is an internecine class war between factions of the American ruling class. More billionaires openly contest for office as it becomes clear millions and billions of dollars are now necessary to get elected.
Voter suppression spreads from state to state to disenfranchise millions, from Georgia to the Dakotas, to Texas and beyond. If one lacks a street number address, or an ID card, or has ever committed a felony, or hasn’t voted recently, or doesn’t sign a ballot according to their birth certificate name, or any other number of technical errors—they are denied their rights as citizens. What was formerly ‘Jim Crow’ for blacks in the South has become a de facto ‘Jim Crow Writ Large’ encompassing even more groups across a growing number of states in America.
A sense of growing political disenfranchisement adds to the feeling that the country is politically unraveling as well—adding to the concurrent fears about growing physical insecurity, worsening economic inequality and declining economic opportunities, and an America mired in never-ending wars. An America in which it is evident that political elites are increasingly committed to policies of redistributing national wealth to the wealthiest. An America where more fear that technology may be taking us too far, too fast. An America where the culture grows meaner, nastier and more decadent, where lies are central to the political discourse, and where political institutions no longer serve the general welfare but rather a narrow social and economic elite who have bought and captured those institutions.
And, not least, an America where politicians seem intent on drifting toward a nationalism on behalf of a soon to be minority White America—i.e. politicians who are willing to endorse violence and oppression of the rest in order to opportunistically assume and exercise power by playing upon the fears, anxieties, and insecurities as the unraveling occurs.
*
Dr. Rasmus is author of the forthcoming book, ‘The Scourge of Neoliberalism: US Policy from Reagan to Trump’, forthcoming 2019 by Clarity Press. He hosts the weekly radio show, Alternative Visions, on the Progressive Radio Network and blogs at jackrasmus.com. His twitter handle is @drjackrasmus. He is a frequent contributor to Global Research.