The Real Alternative


Women, stress and the mind-body connection

Surprising as it may seem, the intimate connection between the mind and the body was not well understood until the closing decades of the last century. In the early 1970s, for example, a research paper published in Scientific American was one of the first studies to investigate this connection scientifically. That particular paper studied what happened in the body when the mind was in a meditative state. The paper found that as the mind settled down with a specific, effective practice of meditation, the body gained a profoundly deep state of rest. Respiration settled significantly. Stress hormones in the blood were reduced. Skin resistance increased (an indicator of increased physiological relaxation). The paper was a landmark work in the scientific recognition of the mind/body connection.

Also in the 1970s we were all becoming familiar with the concept of stress. Stress had been with us for a long time, of course, but through the work of scientists such as Hans Selye it was becoming a defined process. Selye, an endocrinologist, became widely recognized as an expert in the field of stress management. He defined stress as the body’s nonspecific response to a demand placed on it. For example, if we are home alone and hear a strange noise in another room, our heart rate may increase, our blood pressure may rise, adrenaline surges, and our senses become heightened. These physiological changes are the result of what scientists call the fight-or-flight response. Such ancient mechanisms in the human physiology are meant to prepare one either to ‘fight’ in a challenging situation (e.g. the tiger in the path before us) or to remove oneself from danger. While these mechanisms may be useful in a specific challenge, when they occur on a sustained basis they can lay the groundwork for a myriad of health problems.

Understanding stress and the close connection between mind and body spawned another level of discovery about health and disease in terms of psychosomatic disorders. Psychosomatic disorders result from the influence that the mind has over physical processes. A psychosomatic disorder is one in which a physical disease is thought to be caused, or made worse, by mental factors. Such physical diseases—including skin disorders, cardiovascular problems, respiratory disorders, and disturbances of the nervous system including multiple sclerosis—can be particularly aggravated by mental factors such as stress and anxiety.

Women are particularly susceptible to stress. Their lives are challenged by special stressors. Women often care for others much more than they care for themselves. They may push themselves hard in the juggling of professional and personal lives. Stress in women is also often caused by the constant array of hormonal changes occurring in the female physiology. It is important for women to understand how to maintain balance: how to nurture the connection between mind and body, and to avoid the accumulation of stress that can break this vital connection. To prevent the onset of psychosomatic disorders and to avoid the deleterious effects of stress, women can only gain by fostering a healthy mind/body connection.

-Lesley Goldman


Read More: Women stress and the mind body connection – Stress articles – Messaggiamo.Com

The Alchemist and the Sacredness of Following Your Dreams

The Alchemist, by Paulo Coelho, is an inspirational fable about a Spanish shepherd, Santiago, who sets out on a quixotic quest to find a treasure at the Egyptian pyramids. Along the way he learns various spiritual lessons.

Read More: The Alchemist and the Sacredness of Following Your Dreams


Finding happiness without seeking

supreme happiness (Photo credit: Wikipedia)

By Ankush Chauhan

Mostly we look for happiness in the outside world. What we do not know is that it lies inside us. In our own mind lies the secret to be happy.

I am a Zen master, and I help people find happiness: a happiness that is not merely the opposite of sadness. In this article you will find the secret to a happiness that does not depend on outside situations.

We attach too much importance to the outside world. For the average person, things like a posh house, a well-paying job, a successful business, money, or a car are the source of happiness.

If they do not get these things, they are unhappy. As simple as that. They have made themselves dependent on those things. In other words, they are attached.

According to the Buddha, the root of all troubles is ‘attachment’. Attachment to things, people, and objects brings sadness and unhappiness.

The interesting thing is that we spend almost 99 percent of our lives looking for happiness. And we believe that we are going to get it by chasing money, chasing success etc.

Whatever we do in chasing those things brings a lot of tension and unhappiness into our lives.

The more we exert ourselves seeking those things that we ‘think’ will bring happiness, the more we find ourselves in depression.


Here is the Zen buddhist approach:

Accept everything in your life. A total acceptance of everything: good, bad, ugly, beautiful, pain, pleasure is needed.

As soon as you start accepting everything that comes your way, you will live in the moment.

Living in the moment will bring about real happiness. The reason why it is called ‘PRESENT’ is that this moment you are living now is a Gift from nature/God.

The present moment is a gift. Once you begin cherishing everything in it, you will discover real happiness.

All the energy of the universe is concentrated on THIS moment now. Once you discover the hidden energy by living in the present moment, you will get everything you want.

Then your life will be truly happy and blissful.



Ankush Chauhan is a Zen master who helps people realize the bliss in this moment! He blogs about his meditative experiences.

Hailing from a middle class family, Ankush now works full time helping people realize the Buddha that they are. The aim of Ankush is to bring more and more people to the world of bliss and joy that is the result of spiritual awakening.


Can listening to music help you sleep?


Sleep (Photo credit: Wikipedia)

Joseph F Chandler, Birmingham-Southern College

By now, you’ve surely heard that Americans aren’t getting enough sleep.

In our always-on society, a solid chunk of nightly rest seems, well, like a dream. We shave the edges of sleep to keep up, exchanging extra waking hours for compromised health, productivity and safety.

Despite this, we actually know how to sleep better; the list of empirically supported, low-cost, simple behavioral tweaks is extensive, whether it’s avoiding alcohol as bedtime approaches or just going to sleep at a regular hour. Though changing habitual behavior is easier said than done, one of these tweaks may be as simple as putting in your earphones and pressing play.

Recently, British composer Max Richter released an eight-hour-long composition titled Sleep, which he has described as a lullaby, meant to be listened to while sleeping.

The composition ranges from sweeping, airy selections called Dream to the heavy, trance-inducing Space sequence. Indeed, it is an ambitious, impressive piece of conceptual art. But could it actually improve your sleep?

Conflicting results

Research on improving sleep with music is filled with methodological mistakes.

Self-reported sleep quality – the metric of choice for many music studies – often doesn’t correlate with objective measures of sleep: people will often think they’ve gotten a good night’s sleep (best defined as an unmedicated, uninterrupted night somewhere between seven and ten hours). But in many cases, they haven’t.

On the other hand, when objective measures are used (like the industry standard Polysomnography), true control groups (like a placebo group in a drug trial) are often left out.

With these drawbacks in mind, it’s easy to understand why the literature reads as equivocal. Some studies claim music can have a positive effect on sleep quality, while others cite no objective benefit.

A recent, methodologically sound meta-analysis reported an overall positive effect of music for improving sleep in those with a sleep disorder. This is promising, but even the article’s authors admit that more precise work is needed to reach a clear conclusion.

A carefully choreographed cycle

Perhaps the answer is hidden in a more basic question. Given the way sleep is structured, can music even influence it to begin with?

The answer is yes and no.

Sleep is not a gentle slide into unconsciousness. Rather, it’s a complicated ride into an alternate conscious state, where reality is actively created from internal information, rather than external sensation.

That transition from “outside” to “inside” happens in four distinct steps. The sleep process manifests as a non-REM (NREM) phase (which is divided into three parts: NREM 1, 2 and 3) and Rapid Eye Movement (REM).

Imagine you’ve turned on Richter’s full Sleep composition and have just gotten into bed. As your eyes get heavy and your attention wanders, you are entering early NREM 1 sleep. You are deeply relaxed. This lasts for a few minutes.

A selection from Sleep’s Dream sequence.

At this point, the research suggests that Richter’s work may be having an effect; anything that contributes to your relaxation will help induce NREM 1 sleep. Richter’s Sleep certainly has relaxing qualities, like many of the classical pieces often used in music and sleep research.

As you continue to relax, your brain begins to exhibit what are called “organized theta waves,” which slowly switch attention channels from the outside environment to internal cues. At this point, you may feel as if you’re floating or lightly dreaming; if someone says your name insistently enough you may still respond. This lasts about 10 minutes, after which K-complexes and sleep spindles appear in your brain wave pattern.

This is where it gets tricky. K-complexes and sleep spindles – brief bursts of high activity on an otherwise slowing brain wave pattern – actively shield the brain from external stimuli. That is, during this stage your brain purposefully blocks the reception of and response to outside sensory information.

This hallmark of NREM 2 sleep means that, for all intents and purposes, you are no longer hearing Richter’s work. The auditory cortex is still receiving the sounds, but the thalamus – essentially the call center of the brain – stops the signal before any memories or sense can be made of the music.

NREM 2 lasts for about 20 minutes. Then your brain waves become very slow and very organized. These are called delta waves, and they indicate NREM 3: a state of near-complete nonresponsiveness to the external world. After 30 minutes of NREM 3, you briefly travel back up into the lighter stages of sleep, at which point you may again hear the composition. In fact, if it’s loud enough, unusual ambient noises at this point may actually wake you up, disturbing the carefully choreographed cycle.

With time, all external stimuli slip away, and you recede into your dreams.

If you remain asleep, however, you quickly slip into the REM portion of the cycle: your body becomes paralyzed, and your external senses get rewired to pay exclusive attention to your memories. You are essentially awake, but feeding off an internally derived reality to create the crazy dreams associated with REM. At this point I could walk into your room, call your name loudly and leave without you even knowing I was there. In other words, the external world – including what is being piped through your headphones – doesn’t matter for those amazing few minutes of REM sleep.

As the night goes on, the cycle will repeat itself many times, and each time the proportion of REM will become greater. By the end of the night, you are spending most of your time in your own internally created universe, on which the current external world has no bearing. For a grand total of 60 minutes of the eight-hour period, you will be able to hear Mr Richter’s beautiful work. The rest of the time, only your memories matter.

So for all its merits, can Max Richter’s Sleep help you sleep? The answer is probably yes: it could make falling asleep easier. But you’ll be missing most of the show.

The Conversation

Joseph F Chandler, Assistant Professor of Psychology, Birmingham-Southern College

This article was originally published on The Conversation. Read the original article.


Academic print books are dying. What’s the future?

Donald Barclay, University of California, Merced

The print-format scholarly book, a bulwark of academia’s publish-or-perish culture, is an endangered species. The market that has sustained it over the years is collapsing.

Sales of scholarly books in print format have hit record lows. Per-copy prices are at record highs. In purely economic terms, the current situation is unsustainable.

So, what does the future look like? Will academia’s traditional devotion to print and legendary resistance to change kill off long-form scholarship? Or will academia allow itself to move from print-format scholarly books to an open-access digital model that could save, and very likely rejuvenate, long-form scholarship?

Sales down. Prices up

First, let’s look at some of the sales trends. Take the book-centric academic field of history as an example.

In 1980, a scholarly publisher could expect to sell 2,000 copies of any given history book. By 1990, that number had plummeted to 500 copies. And by 2005, sales of a little over 200 copies worldwide had become the norm.

From my own field – library and information studies – the numbers are no less bleak. The editor of a major academic publishing house confided to me this summer that, circa 1995, he could expect to sell 1,000 copies of even a ho-hum library studies book during its first year of publication. In 2015, an outstanding book in the field is considered to be doing well if it manages to sell 200 copies in its first year.

In a classic response to a downward spiral, publishers ended up raising the prices of scholarly books. In 1980, the average price for a hardcover history book was US$22.78; by 2010, that price had almost quadrupled to $82.65.

Similar increases were seen in every other academic field. The average price of a hardcover book on the subject of religion went from $17.61 in 1980 to $80.88 in 2010. For education, the price climbed from $17.01 in 1980 to $177.59 in 2010.

Libraries losing buying power

Neither an anomaly nor a bump in the road, this total market collapse is the result of a long-term trend from which the print-format scholarly book cannot recover.

A root cause for this market collapse is the loss of buying power among academic libraries, traditionally the biggest customer for printed scholarly books.

Libraries have been hit by a double economic whammy: beyond-inflationary increases in the cost of journal subscriptions and an ongoing drop in governmental support for higher education over the past few decades.

As a result, academic libraries have been forced to choose between maintaining their paid subscriptions to journals, the favored information resource of the STEM fields, and scholarly books, the workhorse of the humanities and interpretive social sciences.

Here are some numbers that tell the story of where academic libraries have chosen to put their money:

In the mid-1980s, the ratio of spending on journal subscriptions compared to scholarly books was roughly 50-50. By 2011, that ratio had shifted to 75-25 in favor of subscriptions to academic journals.

The fact that only about half of the scholarly books in academic libraries are ever borrowed has further discouraged librarian investment in the scholarly book.

Changing nature of market

In any case, in a perfect ivory tower world, the economics of the print-format scholarly book would not be a consideration. After all, university presses were created for the specific purpose of publishing scholarship that, while rich in intellectual value, has little or no economic value.

However, in a higher-education environment in which the subsidies once enjoyed by university presses have shrunk or entirely vanished, editors are left with little choice but to consider sales potential before accepting a manuscript for publication.

Consequently, academic rigor aside, the market value of a scholarly book on a perennially popular historical figure like, say, Theodore Roosevelt or a current hot-button social issue such as racism is simply going to be more attractive to a scholarly publisher than a book on Spain’s Golden Age (Siglo de Oro) or land-ownership patterns in Hungary’s 12th-century Árpád Dynasty, whose sales prospects might be dismal.

Even so, in many academic fields the publication of scholarly books still remains the standard by which emerging scholars are credentialed. Is it acceptable that a PhD student in one of those fields might feel forced to choose a dissertation topic based on how a publisher views its sales potential as a book rather than on its contribution to the field?

Why not consider open access?

Bleak as it may seem, the good news is that this need not mean the end of long-form scholarship.

Facing a dismal market, a number of leading scholarly publishers are taking steps to change the economic model of the scholarly book. This change involves moving from a foundation in print to a foundation in digital, and from a focus on sales to libraries to a focus on open access.

What about going digital?
Rob DiCaterino, CC BY

In a notable example, the University of California Press announced the publication this October of the first five titles in its Luminos initiative (http://www.luminosoa.org). Luminos titles are fully peer-reviewed, professionally edited scholarly books initially published as open-access e-books with a print-on-demand option for those who prefer physical books.

This is hardly a one-off venture: similar open-access models for publishing scholarly books are being implemented by presses such as The Ohio State University Press, Penn State Romance Studies, Amherst College Press, ANU (Australian National University) Press, De Gruyter Open, and others.

Open-access initiatives such as these are positioning themselves to disrupt the scholarly book market by shifting to a model in which the cost of publication is recouped by upfront underwriting rather than via sales of copies.

Besides rescuing the scholarly book from oblivion, open-access digital books offer many advantages over their print forebears: the number of potential readers dwarfs what is possible for a run of a few hundred printed copies, and open-access scholarly books can be used, wholly or in part, as course texts at no cost to students.

Additionally, digital formatting loosens constraints on the number of pages and illustrations. Scholars are free to employ tools of digital-age scholarship ranging from timeline-enhanced maps to data visualizations to embedded video. Open-access books can be read in regions of the world where few people can afford First World price tags.

Academic distrust

However, open-access scholarly books can still fail if the senior faculty who make decisions about hiring, promotion, and tenure refuse to embrace the format.

In my experience, many among the senior faculty harbor a lingering distrust of digital publication. Some faculty consider any underwriting of publication costs by the author and/or the author’s institution as nothing more than vanity press publication, where authors have to pay to get published.

For faculty who take this view, such new models of open-access publication are academic sins on a par with plagiarism and “diploma mill” degrees.

In my view, there is no reason why scholarly books published under legitimate open-access models cannot undergo rigorous peer-review and editing processes. Quality peer review and editing are not, after all, functions of paper and ink.

Additionally, with very few exceptions, the cost of publishing a scholarly book has always been subsidized to one extent or another. Circa 1980, publication costs for a printed scholarly book were very likely underwritten by a university press’s campus subsidy.

Arguing that publishing a book under the auspices of a subsidized scholarly press occupies some higher moral ground than publishing under one of the emerging models of subsidized open-access publishing is entirely specious.

If, in the end, the forces of academic conservatism kill the open-access scholarly book by refusing to hire or reward emerging scholars who publish in this way, an unintended consequence will be the death of the scholarly book.

Will the academy stand by and allow the market to determine who succeeds and who fails as an academic? Or, will it move toward open-access publication that offers a viable alternative to a market in collapse?

The Conversation

Donald Barclay, Deputy University Librarian, University of California, Merced

This article was originally published on The Conversation. Read the original article.


How computers broke science – and what we can do to fix it

Photo: US Army

Ben Marwick, University of Washington

Reproducibility is one of the cornerstones of science. Made popular by British scientist Robert Boyle in the 1660s, the idea is that a discovery should be reproducible before being accepted as scientific knowledge.

In essence, you should be able to produce the same results I did if you follow the method I describe when announcing my discovery in a scholarly publication. For example, if researchers can reproduce the effectiveness of a new drug at treating a disease, that’s a good sign it could work for all sufferers of the disease. If not, we’re left wondering what accident or mistake produced the original favorable result, and would doubt the drug’s usefulness.

For most of the history of science, researchers have reported their methods in a way that enabled independent reproduction of their results. But, since the introduction of the personal computer – and the point-and-click software programs that have evolved to make it more user-friendly – reproducibility of much research has become questionable, if not impossible. Too much of the research process is now shrouded by the opaque use of computers that many researchers have come to depend on. This makes it almost impossible for an outsider to recreate their results.

Recently, several groups have proposed similar solutions to this problem. Together they would break scientific data out of the black box of unrecorded computer manipulations so independent readers can again critically assess and reproduce results. Researchers, the public, and science itself would benefit.

Computers wrangle the data, but also obscure it

Statistician Victoria Stodden has described the unique place personal computers hold in the history of science. They’re not just an instrument – like a telescope or microscope – that enables new research. The computer is revolutionary in a different way; it’s a tiny factory for producing all kinds of new “scopes” to see new patterns in scientific data.

It’s hard to find a modern researcher who works without a computer, even in fields that aren’t intensely quantitative. Ecologists use computers to simulate the effect of disasters on animal populations. Biologists use computers to search massive amounts of DNA data. Astronomers use computers to control vast arrays of telescopes, and then process the collected data. Oceanographers use computers to combine data from satellites, ships and buoys to predict global climates. Social scientists use computers to discover and predict the effects of policy or to analyze interview transcripts. Computers help researchers in almost every discipline identify what’s interesting within their data.

Computers also tend to be personal instruments. We typically have exclusive use of our own, and the files and folders it contains are generally considered a private space, hidden from public view. Preparing data, analyzing it, visualizing the results – these are tasks done on the computer, in private. Only at the very end of the pipeline comes a publicly visible journal article summarizing all the private tasks.

The problem is that most modern science is so complicated, and most journal articles so brief, it’s impossible for the article to include details of many important methods and decisions made by the researcher as he analyzed his data on his computer. How, then, can another researcher judge the reliability of the results, or reproduce the analysis?

Good luck recreating the analysis.
US Army

How much transparency do scientists owe?

Stanford statisticians Jonathan Buckheit and David Donoho described this issue as early as 1995, when the personal computer was still a fairly new idea.

An article about computational science in a scientific publication is not the scholarship itself, it is merely advertising of the scholarship. The actual scholarship is the complete software development environment and the complete set of instructions which generated the figures.

They make a radical claim. It means all those private files on our personal computers, and the private analysis tasks we do as we work toward preparing for publication should be made public along with the journal article.

This would be a huge change in the way scientists work. We’d need to prepare from the start for everything we do on the computer to eventually be made available for others to see. For many researchers, that’s an overwhelming thought. Victoria Stodden has found the biggest objection to sharing files is the time it takes to prepare them by writing documentation and cleaning them up. The second biggest concern is the risk of not receiving credit for the files if someone else uses them.

A new toolbox to enhance reproducibility

What secrets are within the computer?
US Army

Recently, several different groups of scientists have converged on recommendations for tools and methods to make it easier to keep track of files and analyses done on computers. These groups include biologists, ecologists, nuclear engineers, neuroscientists, economists and political scientists. Manifesto-like papers lay out their recommendations. When researchers from such different fields converge on a common course of action, it’s a sign a major watershed in doing science might be under way.

One major recommendation: minimize and replace point-and-click procedures during data analysis as much as possible by using scripts that contain instructions for the computer to carry out. This solves the problem of ephemeral mouse movements, which leave few traces, are difficult to communicate to other people, and are hard to automate. Such point-and-click procedures are common during data cleaning and organizing tasks in a spreadsheet program like Microsoft Excel. A script, on the other hand, contains unambiguous instructions that can be read by its author far into the future (when the specific details have been forgotten) and by other researchers. Since scripts are small files, they can also be included within a journal article. And scripts can easily be adapted to automate research tasks, saving time and reducing the potential for human error.

We can see examples of this in microbiology, ecology, political science and archaeology. Instead of mousing around menus and buttons, manually editing cells in a spreadsheet and dragging files between several different software programs to obtain results, these researchers wrote scripts. Their scripts automate the movement of files, the cleaning of the data, the statistical analysis, and the creation of graphs, figures and tables. This saves a lot of time when checking the analysis and redoing it to explore different options. And by looking at the code in the script file, which becomes part of the publication, anyone can see the exact steps that produced the published results.
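To make the idea concrete, here is a minimal sketch of such a cleaning-and-analysis script in Python, using only the standard library. The specimen data, column names, and the rule for handling bad readings are all hypothetical; a real script would read its raw data from a file kept alongside the publication.

```python
import csv
import io
import statistics

# Hypothetical raw export from an instrument; some measurements are
# missing or malformed, as often happens with real data.
RAW = """specimen,site,length_mm
S-001,north,14.2
S-002,north,
S-003,south,15.8
S-004,south,bad_reading
S-005,north,13.9
"""

def clean_rows(text):
    """Parse CSV text, dropping rows whose measurement is missing or non-numeric."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        try:
            row["length_mm"] = float(row["length_mm"])
        except ValueError:
            # The cleaning rule is recorded here, in code, instead of
            # being an undocumented hand-edit in a spreadsheet.
            continue
        rows.append(row)
    return rows

rows = clean_rows(RAW)
lengths = [r["length_mm"] for r in rows]
print(f"kept {len(rows)} of 5 rows")
print(f"mean length: {statistics.mean(lengths):.2f} mm")
```

Because every exclusion and calculation is written down, anyone rerunning the script on the same raw file gets the same result, and a reader can see exactly which rows were dropped and why.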

Other recommendations include the use of common, nonproprietary file formats for storing files (such as CSV, or comma-separated values, for tables of data) and simple rubrics for systematically organizing files into folders to make it easy for others to understand how the information is structured. They recommend free software that is available for all computer systems (e.g. Windows, Mac, and Linux) for analyzing and visualizing data (such as R and Python). For collaboration, they recommend a free program called Git, which helps track changes when many people are editing the same document.
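The appeal of a format like CSV is that it round-trips through plain text with no proprietary software involved. A small sketch, with a hypothetical results table, using Python's standard-library csv module:

```python
import csv
import io

# A small, hypothetical results table.
results = [
    {"site": "north", "n": 12, "mean_mm": 14.1},
    {"site": "south", "n": 9, "mean_mm": 15.6},
]

# Write the table as CSV: plain text that R, Python, Excel,
# or a bare text editor can all open.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["site", "n", "mean_mm"])
writer.writeheader()
writer.writerows(results)
csv_text = buf.getvalue()

# A collaborator reads it back with no special software;
# note that CSV stores everything as text, so values come back as strings.
recovered = list(csv.DictReader(io.StringIO(csv_text)))
print(recovered[0]["site"], recovered[0]["mean_mm"])
```

A file like this, tracked in Git, also produces readable line-by-line change histories, which binary spreadsheet formats cannot offer.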

Currently, these are the tools and methods of the avant-garde, and many midcareer and senior researchers have only a vague awareness of them. But many undergraduates are learning them now. Many graduate students, seeing personal advantages to getting organized, using open formats, free software and streamlined collaboration, are seeking out training and tools from volunteer organizations such as Software Carpentry, Data Carpentry and rOpenSci to fill the gaps in their formal training. My university recently created an eScience Institute, where we help researchers adopt these recommendations. Our institute is part of a bigger movement that includes similar institutes at Berkeley and New York University.

As students learning these skills graduate and progress into positions of influence, we’ll see these standards become the new normal in science. Scholarly journals will require code and data files to accompany publications. Funding agencies will require they be placed in publicly accessible online repositories.

Example of a script used to analyze data.
Author provided

Open formats and free software are a win/win

This change in the way researchers use computers will be beneficial for public engagement with science. As researchers become more comfortable sharing more of their files and methods, members of the public will have much better access to scientific research. For example, a high school teacher will be able to show students raw data from a recently published discovery and walk the students through the main parts of the analysis, because all of these files will be available with the journal article.

Similarly, as researchers increasingly use free software, members of the public will be able to use the same software to remix and extend results published in journal articles. Currently many researchers use expensive commercial software programs, the cost of which makes them inaccessible to people outside of universities or large corporations.

Of course, the personal computer is not the sole cause of problems with reproducibility in science. Poor experimental design, inappropriate statistical methods, a highly competitive research environment and the high value placed on novelty and publication in high-profile journals are all to blame.

What’s unique about the role of the computer is that we have a solution to the problem. We have clear recommendations for mature tools and well-tested methods borrowed from computer science research to improve the reproducibility of research done by any kind of scientist on a computer. With a small investment of time to learn these tools, we can help restore this cornerstone of science.

The Conversation

Ben Marwick, Associate Professor of Archaeology, University of Washington

This article was originally published on The Conversation. Read the original article.

