Jan 22, 2016

Peanut Butter Day

Peanuts are native to the Americas and have been made into an edible paste since Aztec times. Modern peanut butter originated in the late 1800s, with the first patent dating to 1884, though that early version was much runnier than modern ones.

Dr. John Harvey Kellogg patented another paste in 1895 that is much closer to what we see today and served it to patients at the Battle Creek Sanitarium as a health supplement. It was originally so expensive that it became a luxury staple in the early 1900s, commonly served in the upper-class tearooms of New York and paired with a wide array of foods such as cheese, celery, watercress, pimento, and crackers.

The first reference to peanut butter paired with jelly appears in a 1901 recipe by Julia Davis, and by the 1920s the sandwich had caught on with less wealthy members of society, spreading peanut butter around the nation.

As the price of peanut butter dropped, it became extremely popular with children, and today it is one of the most widespread food items in America. In fact, the spread is so popular that there is even a National Peanut Butter Day on January 24th.

Happiness is Physical and Emotional

Japanese researchers have used MRI to map where happiness emerges in the brain. The study, published in Scientific Reports, paves the way for objectively measuring happiness and provides insights into a neurologically based way of being happy.

A team at Kyoto University has found an answer from a neurological perspective. Overall happiness, according to their study, is a combination of happy emotions and life satisfaction coming together in the precuneus, a region in the medial parietal lobe.

People feel emotions in different ways; for instance, some people feel happiness more intensely than others when they receive compliments. Psychologists have found that emotional factors like these, together with life satisfaction, constitute the subjective experience of being happy. The neural mechanism behind how happiness emerges, however, has remained unclear. Understanding that mechanism would be a huge asset for quantifying levels of happiness.

Their analysis revealed that those who scored higher on the happiness surveys had more grey matter mass in the precuneus. In other words, people who feel happiness more intensely, feel sadness less intensely, and are better able to find meaning in life tend to have a larger precuneus.

"Several studies have shown that meditation increases grey matter mass in the precuneus. This new insight on where happiness happens in the brain will be useful for developing happiness programs based on scientific research." Am thinking my precuneus must be enlarged, especially on a Happy Friday.

Every Day vs. Everyday

Every day (two words) means each day. Everyday (one word) means commonplace, ordinary, typical.

Here are some examples of using every day and everyday correctly: Jane takes her dog out for a walk every day. It is important to floss every day.

Jack did not take very good care of his everyday shoes.

An "everyday occurrence" does not necessarily mean it occurs every day. It only means it is an ordinary, commonplace occurrence, not something unusual. Everyday is an adjective, so it describes an attribute of the occurrence.

If something occurs daily, you say it "occurs every day" or that it is a daily occurrence. Since "every day" functions as an adverbial phrase, it cannot be used as an adjective to describe the occurrence.

Smart Light Bulbs

The next big deal may be the smart bulb. Sony last week launched a connected light bulb that contains everything needed for artificial intelligence. It goes on sale in Japan this year. Think of a smart house linked with a number of these lights in different rooms, unobtrusively sitting there, waiting at your beck and call.

Sony's Multifunctional Light works like other smart lights: it can be automated or controlled from a smartphone, and it has built-in Wi-Fi and a dedicated app. It also has a motion detector, brightness meter, temperature and humidity sensors, an infrared sensor, and a memory card slot, plus a built-in speaker and microphone. Wouldn't it be nice to control the temperature for the room you are in vs. the temperature down the hallway?

It is not a stretch to think of adding smoke/gas detectors. Wouldn't it also be great to have it speak in addition to sounding a smoke alarm, then call the fire department and send pictures for you? I imagine it could turn itself on or off under any range of circumstances, like when someone comes into a dark room or leaves a room. It would be nice to have a built-in intercom so you no longer need to yell upstairs to bring more beer and chips. How about replacing the baby monitor with a smart monitor, or having it turn on in the morning along with the alarm clock? Think of it turning the stove off if your food is beginning to boil over. Add a fire extinguisher in the ceiling and it could selectively put out small fires at the source.

It is a step up from clapping to turn a light on to speaking to it and telling it to dim a bit more. It could even be programmed to automatically dim when a TV is turned on. It is easy to think of it as a replacement for the Amazon Echo: have a question, ask it out loud, and the light will search the net and speak back with an answer.

I can also envision it being programmed to know when you are away, so the motion detector will know that no one should be there and can turn on, give an audible alarm, and call the police, or just call the police with no alarm. Heck, a video camera could send a pic of the culprit to the police along with the call.

Devices need electricity, whether from a battery or from the wall. Light bulbs are always wired directly into a fixture, so no battery is needed.

It would also be easy to take it with you and just change apps so it works in your cabin, camper, or hotel room. Ah, technology, how the mind wanders.

New Way to Slice Pizza

For those whose friends or family include crust lovers and crust avoiders alike, here is a novel way to satisfy both.

Make a few gently curved cuts and join corners to middles. You get six slices with crust and six without.

Screen Resolution Evolution

Now that the 2016 Consumer Electronics Show has ended, it seems appropriate to recap where we are with TVs and how we got here.

First, 3D TV is dead. Curved screens remain a hard sell. 4K TV is looking at a short life span, as it is already being usurped by 8K TV. 8K may suffer the same fate unless TV and movie producers begin to crank out content capable of utilizing the new standards. In times past, we always waited for hardware to catch up to our needs; now we are waiting for content to catch up to hardware.

Sharp released its first 8K TV in 2015; the 85-inch LV-85001 costs $133,000. Samsung showed its 110-inch 8K TV in January 2016 and announced that an 11K TV is being developed for the 2018 Pyeongchang Winter Olympics. LG also showed off a 98-inch 8K TV in January 2016. All of this advancement comes amid a current dearth of 4K content, yet these advances may still prove more resilient than the 3D revolution that never happened.

Advances in hardware and software continue to outrun battery capacity and bandwidth speed. Bandwidth is less of an issue in Europe and elsewhere, while the US continues to lag, mostly due to politics, not capability.

The race began with early television. For its first half-century, TV resolution was measured in lines per screen rather than pixels; systems of the 1930s and 1940s ranged from 240 to 819 lines per screen, each improving on the last. Some of these systems used a display method known as progressive scanning, where each line of an image is displayed in sequence, in contrast to interlaced scanning, the traditional analog method in which first the odd and then the even lines are drawn alternately.

In 1953, the NTSC color standard established analog color TV at 525 lines. Europe followed in the 1960s with 625-line standards. Bandwidth barriers, however, limited widespread adoption of analog HDTV.

In 1977, the Apple II introduced color CRT display to home computers by adapting the NTSC color signal. The Apple II achieved a resolution of 280 pixels horizontally by 192 pixels vertically. By the 1980s, home computer makers began using pixels (picture elements) as a unit of measure.

IBM introduced the VGA display standard of 640x480 in 1987. Since then, demand for digital video and video games has driven resolution to greater and greater density. Desktop monitors now commonly run at resolutions up to 2560x1600, while mobile devices range down to 240x320 for the smallest screens.

During the 1990s, plasma and LCD technology pushed TVs toward thinner and lighter designs. In 1996, the US FCC officially mandated digital as the new standard for future DTV/HDTV broadcasting. By 2006, LCDs had become more popular due to better daytime viewing and lower prices; they create colored images by selectively blocking and filtering a white backlight rather than producing light directly.

HDTV uses a resolution of 1920x1080, equivalent to 2,073,600 pixels per frame, and is known as 1080p. 4K Ultra HDTV uses 3840x2160, known as 2160p. This amounts to four times the pixels and twice the linear resolution of HDTV; the 4K name comes from its roughly 4,000 horizontal pixels. The newer 8K doubles each dimension again to 7680x4320, sixteen times the pixels of 1080p.
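To make the pixel arithmetic concrete, here is a quick back-of-the-envelope sketch in Python (nothing assumed beyond the resolutions quoted above) that computes pixels per frame and how each format compares with 1080p:

```python
# Per-frame pixel counts for the formats mentioned above.
formats = {
    "1080p (HDTV)":   (1920, 1080),
    "2160p (4K UHD)": (3840, 2160),
    "4320p (8K UHD)": (7680, 4320),
}

hd_pixels = 1920 * 1080  # 2,073,600 pixels per frame

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels "
          f"({pixels / hd_pixels:.0f}x the pixels of 1080p)")
```

Running it shows 4K at four times and 8K at sixteen times the pixels of 1080p, even though each step only doubles the horizontal and vertical resolution.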

OLED improved color by directly producing colored light, allowing for greater contrast. OLED TVs are also extremely thin, measuring in fractions of an inch.

When the iPhone 4 was released, Steve Jobs claimed that the human eye cannot detect smartphone resolution beyond 300 pixels per inch (Apple's limit at the time). However, others have argued that the eye can actually distinguish 900 PPI or more.

Incidentally, it is the relationship of HD, 4K, 8K, etc., to screen size that makes the difference. Phone screens are small, so 4K and beyond are wasted there, as our eyes cannot perceive the difference. The distance between our eyes and the screen is also a factor; that is why many TV manufacturers list an optimal viewing distance.

As TV sets grow, it takes more pixels to deliver the same clarity of picture that a smaller screen achieves with fewer. Arguments about not being able to tell the difference between HD, 4K, and 8K are relative to screen size and distance from the screen. In the average household, however, the difference between 8K and 4K is likely imperceptible.
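To put numbers on the size-and-distance argument, here is a rough sketch assuming the commonly cited figure that 20/20 vision resolves about one arcminute; the screen sizes are illustrative examples, not measurements of any particular set:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

def max_useful_distance_ft(ppi_value: float) -> float:
    """Distance beyond which a 1-arcminute eye can no longer resolve pixels."""
    pixel_pitch_in = 1.0 / ppi_value
    one_arcminute = math.radians(1.0 / 60.0)
    return pixel_pitch_in / math.tan(one_arcminute) / 12.0  # convert inches to feet

for name, w, h, diag in [("65-inch 4K TV", 3840, 2160, 65.0),
                         ("65-inch 8K TV", 7680, 4320, 65.0),
                         ("5.5-inch 1080p phone", 1920, 1080, 5.5)]:
    p = ppi(w, h, diag)
    print(f"{name}: {p:.0f} PPI, "
          f"pixels blur together beyond ~{max_useful_distance_ft(p):.1f} ft")
```

By this estimate, a 65-inch 4K set already out-resolves the eye from a typical couch distance, which is why the extra pixels of 8K are so hard to notice.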

Stroma Procedure

There is a laser treatment, pioneered by California-based Stroma Medical, that turns brown eyes blue. It is currently available in several countries and undergoing human testing in Costa Rica. The Strōma laser disrupts the brown layer of pigment, causing the body to initiate a natural and gradual tissue-removal process. Once the tissue is removed, the patient's natural blue eye is revealed. The procedure is totally non-invasive and takes about 20 seconds to perform, but it takes two to four weeks to see final results. The current cost is about US $5,000. Reminds me of a song by Crystal Gayle. LINK

China's Wealth

China has 190 billionaires, more than two million millionaires, and ranks a bit behind the US in number of high-net-worth individuals, according to research from Forbes magazine and Boston Consulting Group. Not bad for a communist country.

Horology

The science of timekeeping is known as horology.

Nanosecond and Picosecond - A nanosecond is one billionth of a second, and a picosecond is one trillionth (0.000 000 000 001) of a second.

Planck time - Planck time is the shortest known time span. It is the time it takes for light to travel a Planck length, or 1.616199 × 10⁻³⁵ meters, in vacuum.

Easter celebration date - Easter is normally celebrated on the first Sunday after the first full moon that occurs on or after the Spring Equinox.

Light year - A light year is not a unit of time, but a unit of distance. The International Astronomical Union defines a light year as the distance light travels in vacuum in one Julian Year. In astronomy, a Julian Year corresponds to exactly 365.25 days.

Fortnight - A fortnight is a unit of time that refers to 14 days. It comes from the Old English fēowertȳne niht, meaning fourteen nights. It is commonly used in the UK, Ireland, and many Commonwealth countries. People in the US and most parts of Canada use the term biweekly to refer to a period of two weeks.

New York minute - The phrase in a New York minute refers to a very short period of time or an instant. Legend has it that the phrase originated in Texas in the late 1960s. The phrase was popularized by TV personality Johnny Carson who joked that a New York minute was the time between a traffic light turning green and the car behind one's car honking.

Jiffy - Jiffy is usually used to indicate a very short period of time, but it is formally defined in physics and chemistry as the time required for light to travel one centimeter. Also known as a light-centimeter, a jiffy is equal to about 33.3564 picoseconds.
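A couple of those figures are easy to re-derive from the defined speed of light; here is a minimal Python sketch that checks the jiffy value and the length of a light year:

```python
# Speed of light in vacuum (exact, by definition of the meter).
C = 299_792_458  # meters per second

# Jiffy: the time for light to travel one centimeter.
jiffy_s = 0.01 / C
print(f"jiffy ≈ {jiffy_s * 1e12:.4f} picoseconds")     # ~33.3564 ps

# Light year: the distance light travels in one Julian year (365.25 days).
julian_year_s = 365.25 * 24 * 60 * 60
print(f"light year ≈ {C * julian_year_s:.4e} meters")  # ~9.4607e15 m
```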

Friday the 13th - Any month in the Gregorian calendar that begins on a Sunday will have a Friday the 13th, and there is at least one Friday the 13th in every year. A single calendar year can have up to three Friday the 13ths.
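That calendar fact is easy to check with Python's standard calendar module; this short sketch lists the Friday the 13ths in a few sample years and confirms that each such month began on a Sunday:

```python
import calendar
import datetime

def friday_13ths(year: int) -> list:
    """Return every Friday the 13th in the given year."""
    return [datetime.date(year, month, 13)
            for month in range(1, 13)
            if datetime.date(year, month, 13).weekday() == calendar.FRIDAY]

for year in (2015, 2016, 2017):
    hits = friday_13ths(year)
    # A month whose 13th falls on a Friday necessarily began on a Sunday.
    assert all(datetime.date(d.year, d.month, 1).weekday() == calendar.SUNDAY
               for d in hits)
    print(year, [d.strftime("%b %d") for d in hits])
```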

Jan 15, 2016

Happy Friday

Laughter increases the activity of antibodies in the body by twenty percent, helping destroy viruses and tumor cells.

I always increase my laughter while enjoying a Happy Friday!

Looney Tunes wanted to add a rabbit to its lineup, and animator Ben "Bugs" Hardaway had a sketch of the proposed bunny. When the drawing was finished, he labeled it "Bugs' Bunny": his nickname plus bunny.

Later, when the studio was looking for a name, someone saw the caption at the bottom, simply dropped the apostrophe from Bugs', and the new name was born.

Physicians' Changing Attitudes

Doctors have always faced the problem of how best to tell a patient about a terminal diagnosis. Recently, medical professionals have been more upfront about tragic news such as this. Physicians used to think that by not telling a person they were dying, they would boost their morale and increase their hope.

In 1961, only 10% of physicians believed it was correct to tell a patient of a fatal diagnosis. This changed after studies revealed that nearly 90% of patients said they would like to know the truth about their prognosis.

By 1979, physicians had completely reversed their beliefs, and a survey revealed that 97% felt full disclosure was the correct course to take.