Gist khalidabuhakmeh/d8dc859af36c2314fa6e8b1991f1c71d (last active January 24, 2017 19:05)
[
{
"id" : "1483228800-2017-01-01-test-search-excerpt",
"site" : "Test",
"title": "Test Search Excerpt",
"url": "http://localhost:4000/2017/01/01/test-search-excerpt.html",
"categories" : [],
"tags" : [],
"authors" : [],
"publishedDate" : "2017-01-01 00:00:00 +0000",
"excerpt" : "This is the search excerpt.",
"content" : "This is the post body."
},
{
"id" : "1482796800-economy-uncategorized-2016-12-27-this-letter-offers-a-clue-into-how-trump-could-try-to-change-american-business",
"site" : "Washington Post Blog",
"title": "This letter offers a clue into how Trump could try to change American business",
"url": "http://localhost:4000/economy/uncategorized/2016/12/27/this-letter-offers-a-clue-into-how-trump-could-try-to-change-american-business.html",
"categories" : ["Economy","Uncategorized"],
"tags" : ["Economy","Uncategorized"],
"authors" : ["Danielle Paquette"],
"publishedDate" : "2016-12-27 00:00:00 +0000",
"content" : " Republican President-elect Donald Trump has promised to roll back regulations. (Jae C. Hong/AP) In March, a debate broke out about why Carrier, the air conditioning manufacturer, was planning to move 2,100 jobs from two Indiana factories to Mexico. “This is about Carrier chasing Mexican wages at $3 an hour,” Democratic Sen. Joseph Donnelly (Ind.) charged at the time. That prompted Jim Schellinger, president of the Indiana Economic Development Corporation, to write a letter to correct the record. While Carrier would indeed pay Mexican workers $3 an hour, plus another $3 in benefits, “extensive federal regulations were the leading factor of the decision to relocate 2,100 manufacturing jobs,” he wrote to Donnelly, according to a copy of the letter released as part of a public records request. Carrier’s plans sparked national attention after President-elect Donald Trump, who had promised to prevent the offshoring, intervened, securing a deal to keep 800 jobs on American soil in exchange for roughly $7 million in state tax incentives. That was a sliver of the $65 million the company projected it would annually save after shuttling jobs south of the border, triggering questions about why Carrier agreed to the deal. The March letter, which was previously reported by Indianapolis television news, may offer a clue. How Trump and Carrier reached an agreement remains murky. Speculation has focused on the company’s desire to please the new president and maintain United Technologies’ government contracts. But in a company statement, Carrier asserted the Trump administration will “create an improved, more competitive U.S. business climate.” Trump, meanwhile, has promised to scale back regulation. “Fifty-three new regulations,” Trump said in a Dec. 1 speech at Carrier’s Indianapolis plant, repeating a figure that Robert McDonough, the chief executive of Carrier parent United Technologies, had used to explain the regulations his company faced. 
“Massively expensive and probably none of them amount to anything in terms of safety. … Your unnecessary regulations are going to be gone.” (United Technologies declined requests for comment.) Neither Trump nor McDonough identified which regulations they think are stymieing enterprise, but Carrier sent a list of Department of Energy efficiency rules to Schellinger, records show. The Energy Department standards, to be clear, apply to home and commercial appliances sold in the United States. They wouldn’t fall away because a business leaves the country. Francis Dietz, spokesman for the Air Conditioning, Heating and Refrigeration Institute, the trade group that represents Carrier, said the recent flurry of efficiency regulations under President Obama has jacked up overall costs for companies like Carrier. Seeking cheaper labor, he said, is one way to offset that. “The point is: When you have a large number of regulations, that ends up costing you a lot of money, and you start looking for other ways to save money,” Dietz said. “This leads companies to move some of their operations to where the cost to produce them is less.” He produced a list of 52 measures, introduced by the Department of Energy between 2009 and 2016, that seek to alter the way companies make their heating and cooling products — including air conditioners and furnaces. Trump has vowed to kill the EPA’s Clean Power Plan and said he might back out of the Paris climate agreement, moves that dramatically downgrade climate change’s status as a national priority. He has not, however, blasted energy efficiency rules, which save consumers money while also curbing carbon emissions. Obama’s Energy Department has finalized more new standards for energy efficient appliances than any past administration, pushing improvements for 45 different products, such as refrigerators and light bulbs. 
James Sweeney, director of Stanford University’s Precourt Institute for Energy Efficiency, said if Trump wanted to undo Obama’s rules, he couldn’t knock them out by himself or quickly. Removing them would require producing new proposals that would have to pass economic analyses and survive any lawsuits. (Regulations finalized after May, however, could be subject to a speedier congressional reversal, thanks to Republican dominance on Capitol Hill.) “He’d have to go through the process in reverse,” Sweeney said. Under Obama’s rules, Americans would save an estimated $540 billion in electric and other bill expenditures through 2030, said Andrew deLaski, executive director of the Appliance Standards Awareness Project. The updates are also expected to annually slash emissions during that period by 210 million tons. Over the last three years, the DOE has released 28 rules that would impact companies in the heating and cooling space, including Carrier. John Hurst, vice president of government affairs and communications at Lennox International, which makes products similar to Carrier’s and also operates factories in Mexico, said the energy efficiency rules eat up his company’s money and time. Developing new products is expensive, he said. Creating an energy-saving air conditioner, for example, is costlier, too — they take up more space and therefore require more copper and steel. “That,” he said, “can reflect an impact on American jobs.” Still, he said, Lennox does not advocate for a rollback of rules they’ve already negotiated with the Energy Department. Hurst said he has met with DOE officials at least 15 times over the past two years and a Carrier representative was present to provide input as well. The company has already made the investments necessary for new products. The manufacturing changes are underway. Regardless of what Trump does, he said, Lennox expects a deceleration of new regulations. 
“That’s a natural outcome,” he said, “because we’ve negotiated long-term efficiency regulations with other business leaders. That provides regulatory certainty for the next 12 to 15 years, in most cases.” More on Wonkblog: Carrier union leader: Trump ‘lied his a– off’ This could be Donald Trump’s first big test Why Trump would struggle to punish companies that offshore jobs"
},
{
"id" : "1482796800-health-20care-2016-12-27-the-us-spends-more-on-health-care-than-any-other-country-here-s-what-we-re-buying",
"site" : "Washington Post Blog",
"title": "The US spends more on health care than any other country. Here’s what we’re buying",
"url": "http://localhost:4000/health%20care/2016/12/27/the-us-spends-more-on-health-care-than-any-other-country-here-s-what-we-re-buying.html",
"categories" : ["Health Care"],
"tags" : ["Health Care"],
"authors" : ["Carolyn Y. Johnson"],
"publishedDate" : "2016-12-27 00:00:00 +0000",
"content" : " (Washington Post illustration; iStock) American health-care spending, measured in trillions of dollars, boggles the mind. Last year, we spent $3.2 trillion on health care – a number so large that it can be difficult to grasp its scale. A new study published in the Journal of the American Medical Association reveals what patients and their insurers are spending that money on, breaking it down by 155 diseases, patient age and category – such as pharmaceuticals or hospitalizations. Among its findings: Chronic – and often preventable – diseases are a huge driver of personal health spending. The three most expensive diseases in 2013: diabetes ($101 billion), the most common form of heart disease ($88 billion) and back and neck pain ($88 billion). Yearly spending increases aren’t uniform: Over a nearly two-decade period, diabetes and low back and neck pain grew at more than 6 percent per year – much faster than overall spending. Meanwhile, heart disease spending grew at 0.2 percent. Medical spending increases with age – with the exception of newborns. About 38 percent of personal health spending in 2013 was for people over age 65. Annual spending for girls between 1 and 4 years old averaged $2,000 per person; older women 70 to 74 years old averaged $16,000. The analysis provides some insight into what’s driving one particularly large statistic: Within a decade, close to a fifth of the American economy will consist of health care. “It’s important we have a complete landscape when thinking about ways to make the health care system more efficient,” said Joseph Dieleman, an assistant professor at the Institute for Health Metrics and Evaluation at the University of Washington who led the work. [The alluring idea that we can cure cancer has become a trap] The data show that the primary drivers of health-care spending vary considerably. 
For example, more than half of diabetes care is spending on drugs, while only about 4 percent of spending on low back and neck pain was on pharmaceuticals. Generally, more spending is done on elderly people, but about 70 percent of the spending on low back and neck pain was on working-age adults. Such insights provide a way to find the drivers of growth in health-care spending and to launch strategies to control it. “Data like this continues to draw attention to the fact a lot of these proposals being discussed about controlling health-care costs really don’t address the underlying issue, which is rising disease prevalence,” said Ken Thorpe, a professor of health policy at Emory University who was not involved in the study but has done similar research. “You see this rise in chronic disease spending – much of it is potentially preventable.” Most of the discussion of health care in America has focused on access to insurance, but the spending breakdown shows that the biggest opportunities may come in preventing disease. [Americans are shouldering more and more of their health-care costs] The researchers also analyzed spending on public health and prevention. In a separate editorial, Ezekiel Emanuel, a former health-care adviser to President Obama, pointed out that the largest public health spending was on HIV. But fewer than 7,000 Americans died because of HIV/AIDS in 2014 and it ranked 75th on the list of diseases by personal health expenditures. “Few public health dollars focus on lifestyle conditions that ultimately contribute to the majority of chronic illnesses seen today,” Emanuel wrote. Low back and neck pain, for example, ranked low on the list of public health expenditures with $140 million in public health funding, but high on the list of health-care spending. Tobacco control received $340 million in public health spending, but smoking contributes to several diseases that drive health-care spending. 
What the data also show is that conditions that drive health-care spending aren’t necessarily the ones that come to mind when people think about health care. Falls were the fifth-highest cause of health spending, followed closely by depression. Pregnancy and dental care were in the top 15. Read More: Why treating diabetes keeps getting more expensive Why the diseases that cause the most harm don’t always get the most research money Under Obamacare, fewer people skipped doctors’ visits because of cost [Poll shows Obamacare started looking a lot better after the election](https://www.washingtonpost.com/news/wonk/wp/2016/12/01/poll-shows-obamacare-started-looking-a-lot-better-after-the-election/?utm_term=.5ffa2d6e5a78)"
},
{
"id" : "1482796800-uncategorized-2016-12-27-the-secret-feminist-history-of-shopping",
"site" : "Washington Post Blog",
"title": "The secret feminist history of shopping",
"url": "http://localhost:4000/uncategorized/2016/12/27/the-secret-feminist-history-of-shopping.html",
"categories" : ["Uncategorized"],
"tags" : ["Uncategorized"],
"authors" : ["Jeff Guo"],
"publishedDate" : "2016-12-27 00:00:00 +0000",
"content" : " Radical. (Hannah McKay/European Pressphoto Agency) For America’s malls, December was once the happiest time of the year. Now, each holiday season brings a painful reminder that shoppers have increasingly abandoned real-life storefronts for virtual ones. To get people off the couch, mall owners are trying to bring back the idea of shopping as a social activity. They’re investing in free cocoa and “elfie selfie” stations, and they’ve doubled down on the mall Santa, building him expensive high-tech palaces decked out with “Naughty O’ Nice Meters” and “Elf-Ray Vision.” Even stores that have historically shunned these traditions, like Toys ‘R Us, are now getting in the game. It might be too late. The notion of strolling through a physical mall is starting to feel old-fashioned, like barbershop quartets, or writing in cursive. This is how people used to buy things, Virginia, before drone deliveries and the sundry triumphs of on-demand capitalism. But once upon a time, shopping galleries were deeply radical spaces. In fact, it’s impossible to tell the full story of women’s rights without talking about the rise of the mall and its predecessor, the shopping district. These places were crucial to the invention of shopping as an experience: as an act of leisure, as a way to spend an afternoon. And in doing so, they opened up modern cities to women and gave them areas where they, like men, could wander at will. For many middle-class housewives in Victorian England, shopping was their first taste of real freedom, and the starting point for their push into public life, explains historian Erika Diane Rappaport. “During a period in which a family’s respectability and social position depended upon the idea that the middle-class wife and daughter remain apart from the market, politics, and public space, the female shopper was an especially disruptive figure,” she writes in her history “Shopping For Pleasure.” Bazaars and markets are as old as civilization, of course. 
But the idea of ambling through stores, sipping on cocoa, and admiring (but not necessarily buying) the merchandise — that is a thoroughly modern activity that first gained popularity in the 1800s. And for the time, it was also a minor scandal. As urban centers coalesced in the 19th century, they were primarily the domain of men. Cities were sites of politics and business. Women weren’t entirely excluded, says historian Mica Nava, but their public presence was scarce. They could attend galleries and exhibitions with a male chaperon, for instance; and some shopping did exist, but primarily among wealthy ladies. What changed in the 19th century was industrialization and the manufacturing revolution, which churned out furniture, flatware and clothing in dazzling volumes. The explosion in the variety and availability of affordable consumer goods meant that the growing middle class could suddenly buy things just for the joy of it. And the task of tastefully selecting among these luxury goods fell to the women. Shopping gave middle-class women a foothold in the modern city, and for many, a new pastime. Soon, housewives started roaming the city under the pretense of buying things. By this new definition, “shopping” didn’t always involve an actual purchase. It was about the pleasures of perusing — taking in the sights, the displays, the people. Not everyone was happy about the intrusion of women into urban life. Even in the late 1800s, many still looked down on ladies who walked the streets without a male chaperon. Newspaper columnists condemned their shopping habits as salacious acts of public consumerism. “Perhaps nothing was more revolting than the spectacle of a middle-class woman immersed in the filthy, fraudulent, and dangerous world of the urban marketplace,” Rappaport writes. But urban retailers eagerly welcomed the women. They invented places like the department store, where women could shop comfortably, surrounded by amenities, and in semiprivate. 
“By providing a reason — shopping — for women to appear unescorted in public, as well as arranging safe spaces like restrooms and tea rooms where women could gather or sit alone without fear of being molested by men … department stores also made it possible for women to leave the domestic space of the home and lay claim to the center of the city,” write sociologists Sharon Zukin and Jennifer Smith Maguire. Slowly, the city reconfigured itself in response to the demands of shopping women. In the London of the early 1800s, suburban women day-trippers often had no place to eat lunch or even use the restroom. But soon, Rappaport writes, feminists were pressuring the city government to install public lavatories. Women’s clubs and tea shops sprang up for women to grab a bite in between their shopping excursions. With these social changes came new social ills. On both sides of the Atlantic, there was an outbreak of shoplifting. But since the perpetrators were typically well-to-do women, they weren’t thrown in jail, explains historian Elaine Abelson. Doctors decided that this was a medical condition related to their uteruses, and invented the disease “kleptomania.” This epidemic of petty, middle-class crime made huge waves in the popular culture, where there were songs and movies about female shoplifters. The act of acquiring things was increasingly seen as its own pleasure, and many women blamed department stores for being temples of temptation. By the early 1900s, London’s shopping scene also became a battleground for the women’s suffrage movement, who went on window-smashing raids against the same stores that relied on their business. The suffragettes took advantage of women’s newfound place in urban life, which allowed them for the first time to move freely in parts of the city. 
“Suddenly women who had a moment before appeared to be on peaceful shopping expeditions produced from bags or muffs, hammers, stones and sticks, and began an attack upon the nearest windows,” one Daily Telegraph article described, according to Rappaport. These violent efforts eventually helped women in England win the vote in 1918. Now a century later, this world of militant suffragettes and male chaperons sounds like an alien planet. We take for granted a lot of the changes that were set into motion when department stores gave women an excuse to take more and more excursions outside the home. It’s of course sexist that shopping today is still perceived as a “girlie” activity. But at the time, shopping helped women assert themselves and assert their economic importance in a society that denied them a larger role in the public sphere. As Rappaport writes, “For women with few public activities and limited employment and educational options, shopping allowed them to occupy and construct urban space.” (And, daresay, suburban malls served something of the same purpose for the boys and girls of the ’80s and ’90s.) So let’s sidestep all of those French philosophers who have written so scathingly about consumption culture, except to concede that yes, we often buy things because it is fashionable, and yes, we often buy things that we don’t need. So what? Our consumerist habits are not going away. They’re just moving online. What is disappearing is the shopping mall — and with it, the notion of shopping as a social activity. It’s okay to be nostalgic for all that it once symbolized."
},
{
"id" : "1482796800-blogging-content-20marketing-influencer-20marketing-public-20relations-reputation-20management-2016-12-27-my-best-biznology-blog-posts-of-2016",
"site" : "biznology Blog",
"title": "My Best Biznology Blog Posts of 2016",
"url": "http://localhost:4000/blogging/content%20marketing/influencer%20marketing/public%20relations/reputation%20management/2016/12/27/my-best-biznology-blog-posts-of-2016.html",
"categories" : ["Blogging","Content Marketing","Influencer Marketing","Public Relations","Reputation Management"],
"tags" : ["Blogging","Content Marketing","Influencer Marketing","Public Relations","Reputation Management"],
"authors" : ["Chris Abraham"],
"publishedDate" : "2016-12-27 00:00:00 +0000",
"content" : "Everyone’s doing clip shows this late into the year. My options were either doing a best of, “here’s what you might have missed,” list of articles, which I have chosen to do; or, I could very well have done a deep prediction of what’s going to be hot in 2017. I think I’ll predict the future right after I attend Renaissance Weekend this week in Charleston, SC. Maybe by the 3rd I’ll know what the future may bring. All I know right now is that it seems to me like the rest of the industry is slowly catching up with Dan Krueger and me, Chris Abraham. Folks are really starting to understand the value of the micro-influencer when it comes to promoting their event, brand, product, service, or cause. That’s very exciting because it lends a lot of credibility to strategies and tactics we have been successfully and powerfully using over at Gerris Corp since well before it was Gerr.is, way back in 2006 — now over a decade ago! What’s even more exciting is that Earned Media PR and Marketing is still alive and well — you just need to not suck! If you bring it, they will post, share, tweet, and blog you! Kindness, respect, appreciation, good humor, and love can unlock a lot of doors online, don’t you forget it! I hope my exhaustive list of all my favorite blog posts I share with you below allows you to catch up on the articles, posts, and content that you missed in 2016. I look forward to continuing to selflessly and shamelessly share every single little business thing in my head openly and honestly. Instead of black-boxing our entire process and sneaking around in the shadows, we at Gerris prefer to share it all and scare you with the sheer amount of work and discipline it all takes to actually do, hoping you’ll prefer to let us gladly and happily do it instead of simply trying to reinvent our wheel. If you have any questions or want to work with Dan and me, feel free to call or email me. 
My Best Biznology Blog Posts of 2016 Use Instagram as the starting point to promote your business online The state of the blogger outreach union ten years on Influencer marketing should not be about collecting influencers Take your bloggers and social media influencers to lunch Influencer outreach is more PR than marketing or advertising Becoming a reddit Internet sensation Launching your social campaign with guns blazing! Your team is the best content source for search Help Google help you optimize your content for search Influencer marketing tools of the trade Earned media influencer marketing demands your awesome Earned media influencer outreach is alive and well Kindness opens many doors online If you post it will they come (to read your blog)? Influencer marketing is one percent inspiration, ninety nine percent perspiration It’s normal to be bad before you’re good Remember to market to your own influencer network Punch hard through the entire campaign clear through to the final hours Selling is to hunting like marketing is to trapping Exploit the hours employees spend on social media Micro-influencer marketing is the new long tail Today’s online influencers are wicked tough Search + Social = Online Reputation Management The more the messier for content marketing SEO success What content to blog for Google search success All your best content may now be considered fake news Feed Google fresh sandwiches every day instead of Christmas dinner once-a-year Like this post? Sign up for our emails here. The post My Best Biznology Blog Posts of 2016 appeared first on Biznology."
},
{
"id" : "1482710400-economy-politics-2016-12-26-our-national-deja-vu-on-taxing-the-rich",
"site" : "Washington Post Blog",
"title": "Our national deja vu on taxing the rich",
"url": "http://localhost:4000/economy/politics/2016/12/26/our-national-deja-vu-on-taxing-the-rich.html",
"categories" : ["Economy","Politics"],
"tags" : ["Economy","Politics"],
"authors" : ["Jim Tankersley"],
"publishedDate" : "2016-12-26 00:00:00 +0000",
"content" : " Then-Treasury Secretary Timothy Geithner enters the U.S. Capitol to discuss the impending fiscal cliff in November 2012. (Michael Reynolds/European Pressphoto Agency) Four years ago, fresh off what seemed at the time to be an unusually bitter and divided presidential election, lawmakers gathered in late December to do something rare: They cut a deal. The “fiscal cliff” compromise left almost no one happy. It neither supercharged nor tanked the economy. But it did resolve what had been a 12-year argument over how much to tax high-earning Americans. That resolution is about to be overturned, with big implications for, well, how much rich people pay in taxes. The Internal Revenue Service keeps track of the effective tax rate every year for the 400 highest-earning American taxpayers. There’s a clear pattern in its data, which now run through 2014: When Bush and Congress cut top income tax rates and taxes on capital gains in 2001 and 2003, high earners paid a lower share of their income in taxes. After the fiscal cliff deal, that share increased, from a low of about 17 percent to more than 23 percent. You can reasonably anticipate that average tax rate will fall again, under the tax changes that President-elect Donald Trump is pushing. Trump talked a lot about tax cuts on his road to the presidency. He mostly emphasized reductions for the middle class, but his plan would actually deliver its largest benefits by far to the very top sliver of income-earners, in absolute dollars and in percentage terms. (The House Republican plan is even more concentrated in its benefits for the very rich, independent analyses find.) Trump breaks from previous Republican presidents on many economic issues, particularly trade and immigration, but his tax plan includes the same top rate that candidate George W. Bush proposed in 2000. 
[The very interesting thing that happened when Obama raised rich people’s taxes] Bush didn’t quite reach his campaign goal of a 33 percent top marginal income tax rate; he had to settle for 35 percent. The fiscal cliff deal raised that rate back to nearly 40 percent. In between, Democrats and Republicans sparred repeatedly about that number. The debate in 2012 between President Obama and Mitt Romney wasn’t really about rolling back the “Bush tax cuts.” Both candidates wanted to keep the cuts that helped low-, middle- and even some high-income earners. The big question was whether to keep the rate low for the very rich. Trump muddied that debate on the trail. His plan quite clearly sides with Bush, Romney and lower rates on top earners. His rhetoric, at times, suggested otherwise — and has continued to since his election, with his Treasury pick, Steven Mnuchin, suggesting that Trump will insist top earners receive no net tax cut in any deal. Many economists consider that pledge impossible, because there aren’t enough loopholes to close to offset the gains from rate cuts that Trump is proposing for high earners. The likelier scenario is that Trump and the congressional GOP will pass a plan that reduces taxes on the rich, in hopes of spurring faster economic growth. Democrats will object; campaign lines will be drawn. The issue could loom large in the midterm congressional elections, and in 2020. The fiscal cliff will prove a historical footnote. The battle over top rates will drag into another decade, and maybe beyond. More from Wonkblog: Under Trump, red states are finally going to be able to turn themselves into poor, unhealthy paradises As the rich become super-rich, they pay lower taxes. For real. Donald Trump’s tax plan now favors the ultra-rich even more"
},
{
"id" : "1482537600-data-20visualization-health-20care-2016-12-24-where-the-heaviestdrinking-americans-live",
"site" : "Washington Post Blog",
"title": "Where the heaviest-drinking Americans live",
"url": "http://localhost:4000/data%20visualization/health%20care/2016/12/24/where-the-heaviestdrinking-americans-live.html",
"categories" : ["Data visualization","Health Care"],
"tags" : ["Data visualization","Health Care"],
"authors" : ["Christopher Ingraham"],
"publishedDate" : "2016-12-24 00:00:00 +0000",
"content" : "Mission Accomplished. Got everything done. Now I get to binge drink through New Year’s. — Comfortably Smug (@ComfortablySmug) December 22, 2016 Twitter user “Comfortably Smug” tidily sums up the holiday sentiments of many of us as we wrap up 2016’s loose ends before checking out until the new year. While we here at Wonkblog don’t recommend that you “binge drink through New Year’s,” there’s no doubt that the holidays have traditionally been a time for boozing it up. Take a gander, for instance, at the total monthly alcohol sales in the United States. If you squint really hard you may detect a seasonal trend — those spikes are December of each year. We’re not just buying booze during the holidays, of course — we’re guzzling it down, too. Various direct and indirect measures of alcohol consumption, including breathalyzer data, Web searches for hangover relief and alcohol-related traffic deaths all suggest that peak American drinking happens between Thanksgiving and New Year’s. But who among us is likely to do the most drinking this holiday season? The Department of Health and Human Services recently updated the official federal statistics on the percent of state residents ages 12 and older who drink at least once a month. Here’s a map of how those figures break down by state for the years 2014 and 2015. New England is home to the nation’s heaviest drinkers — New Hampshire, where about 64 percent of residents age 12 or older drink monthly, is tops in the country. Vermont, Maine and Connecticut also come in at drinking rates above 60 percent. Hard-drinking cheeseheads in Wisconsin see to it that their home is the only Midwestern state in the top tier of American drinkers. The next tier of heavy drinking states are all in the northern part of the country. Some researchers posit that there may be a relationship between heavy drinking and latitude — at the country level, alcohol consumption tends to increase the farther you get away from the equator. 
This could be a function of the potential for boredom and depression during winter months when the nights are long, the days are short, and baby it’s cold outside — for a prime example of this, see recent stories involving alcohol and misconduct among people who live in Antarctica. But other cultural factors can attenuate this relationship. On the map above, take a look at Utah and particularly Idaho. They’re in the bottom tier of the states for drinking frequency. Utah, where only 31 percent of adults drink in a given month, comes in dead last. This is almost certainly because of the large Mormon populations in those states — 58 percent of Utahans are Mormon, as are 24 percent of people in Idaho. Mormonism generally prohibits the use of alcohol and other drugs. There’s likely a similar religious influence in places like Alabama, Mississippi and the other Southern states where drinking is low. Those states have large evangelical Christian populations, many of whom are abstainers. One interesting thing about American drinking rates is how little they change over time. In all of the United States, the past month drinking rate in 2014/2015 (52 percent) is essentially unchanged from the rate in 2008/2009. Public health folks typically talk about drinking in terms of how bad it is for you — how we drink too much and don’t tax alcohol enough, and how it’s basically killing us. But in the spirit of holiday cheer, I’ll close with a reminder that the main reason people drink is because it’s fun, as one group of scientists finally discovered in 2016. In most cases, pouring yourself a cold one is associated with roughly a 4 percent boost to your happiness. I’ve included a table of the state-level drinking rates below. Enjoy your data responsibly. 
State Monthly drinking rate, 2008-2009 Monthly drinking rate, 2014-2015 Alabama 42.94 43.94 Alaska 54.52 54.98 Arizona 51.15 51.19 Arkansas 43.29 41.81 California 50.65 51.54 Colorado 62.22 59.22 Connecticut 59.32 60.33 Delaware 56.32 53.85 Florida 51.41 54.83 Georgia 48.89 48.9 Hawaii 48.23 46.6 Idaho 45.71 44.54 Illinois 54.33 54.25 Indiana 48.06 50.44 Iowa 56.7 56.01 Kansas 53.92 52.55 Kentucky 38.74 42.63 Louisiana 48.24 50.03 Maine 55.07 60.41 Maryland 54.24 58.38 Massachusetts 61.54 57.91 Michigan 54.65 53.89 Minnesota 60.95 59.4 Mississippi 40.39 39.52 Missouri 50.15 50.78 Montana 59.07 57.93 Nebraska 53.59 57.43 Nevada 54.47 52.81 New Hampshire 63.98 63.63 New Jersey 54.18 56.67 New Mexico 47.59 47.02 New York 55.72 54.6 North Carolina 47.45 45.84 North Dakota 57.8 58.79 Ohio 51.67 52.56 Oklahoma 45.62 46.04 Oregon 59.77 58.85 Pennsylvania 55.06 56.56 Rhode Island 59.91 59.37 South Carolina 45.89 46.17 South Dakota 57.81 58.02 Tennessee 41.06 43.44 Texas 48.42 47.9 Utah 28.11 31.31 Vermont 60.28 60.67 Virginia 50.79 51.77 Washington 56.13 57.26 West Virginia 38.17 41.28 Wisconsin 61.78 60.36 Wyoming 54.21 54.32"
},
{
"id" : "1482451200-reputation-20management-social-20business-social-20media-20marketing-social-20media-pr-uncategorized-2016-12-23-now-is-the-time-to-fix-your-linkedin-profile-here-s-why",
"site" : "biznology Blog",
"title": "Now is the time to fix your LinkedIn profile—here’s why",
"url": "http://localhost:4000/reputation%20management/social%20business/social%20media%20marketing/social%20media/pr/uncategorized/2016/12/23/now-is-the-time-to-fix-your-linkedin-profile-here-s-why.html",
"categories" : ["Reputation Management","Social Business","Social Media Marketing","Social Media/PR","Uncategorized"],
"tags" : ["Reputation Management","Social Business","Social Media Marketing","Social Media/PR","Uncategorized"],
"authors" : ["Susan Tatum"],
"publishedDate" : "2016-12-23 00:00:00 +0000",
"content" : "Have you been meaning to update your LinkedIn profile and just haven’t gotten around to it? You’re not alone, and I hope this article will lead you to rethink that position. That’s because the seemingly harmless act of ignoring your profile can cost you, especially in the area of new business opportunities. Read on to see what happened to three professionals who didn’t update their profiles. An inaccurate profile Barry is a partner in a Tier 1 accounting firm. He’s been there nearly two years. He hasn’t bothered to update his LinkedIn profile since he joined the firm. It just doesn’t seem important. When potential new clients check him out on LinkedIn (as 59.9% of them will do), they become confused. Barry says he’s a partner at Company A, but LinkedIn says he works at Company B. Which is true? Doubt = risk = lost opportunity. An incomplete profile Rebecca is a partner at a top 500 law firm. Her LinkedIn profile is incomplete. No picture. No details beyond a list of jobs. She knows she needs to do something about it but she’s working her butt off and can’t give it the time. Meanwhile, Austin is GC at a pharmaceutical company. He saw Rebecca speak a few years ago at a conference. He remembers enough to find her LinkedIn profile, but it’s practically blank. Although she works at a firm with a stellar reputation, he finds other attorneys with Rebecca’s experience who share their backgrounds. He’s more comfortable contacting them. He never gives Rebecca a second thought. An invisible profile Roger is a transfer pricing economist in New York City. There are 183 attorneys in New York on LinkedIn who might need his services. But Roger’s profile does not show up on a LinkedIn search. Instead, anyone looking will find 21 other people who can do the job. Roger never even gets a first thought. Inaccuracy, incompleteness, and no search optimization. How much new business are you missing because of any of these? 
The LinkedIn profile has become an essential tool for getting new business. Not too long ago, Barry, Rebecca and Roger might have been able to ignore LinkedIn without too much ill effect. Client recommendations, a few speaking engagements, and publishing some articles were enough to meet most new business needs. But the world has changed, and now more than 30% of professional services buyers find providers with an online search. In today’s market, even client-recommended future clients are checking out references on social media (source). If you’re not putting yourself forward in the best possible light, providing relevant details, and allowing potential new clients to get to know you before committing to a phone call, you’re putting yourself at a great disadvantage. 4 reasons why buyers are making LinkedIn a key part of their decision-making process Multiple studies have included questions to determine why buyers rely on social media in general, LinkedIn especially, to help them find solutions to problems and make buying decisions. The answers tend to fall into the following categories: Access: Social media provides buyers access to a much broader network of peers and experts than they can otherwise tap into. Think about it: let’s say you’re in the US and you need a consultant to help you determine why your employee churn is so high and to fix the leak. By running a quick search on LinkedIn you can identify a large portion of management consultants with experience in handling employee retention issues. From there you can reach out to any one of them for help. Where else can you do that? Trust: 58% of buyers say they go to social media to learn from trustworthy peers and experts. Buyers feel more confident they can identify true experts and get trustworthy information, especially on LinkedIn. I believe this is at least partially due to the transparency of the network. 
It’s much more difficult to claim false experience or accomplishments when those claims are easily viewed and disputed by people who know the truth. Efficiency: There once was a time when buyers had to shoot out email messages and call around to find out who might be able to help them. Then they had to contact each person individually to ask about solutions. Sometimes they had to go to conferences just to ask their questions. All this before even being sure of what they were looking for. Today, with the use of social media, buyers can connect with experts, get answers, and vet solutions quickly and easily from the comfort of their office, home or even the kids’ soccer match. Relevance: Buyers also like social media because it provides context in which to connect with providers. This is like attending an offline networking event or conference, except they don’t have to go anywhere. This is especially true of LinkedIn, which is a business-to-business networking site uncluttered with cat videos or other distractions. So give them what they want. Access. Trust. Efficiency. Relevance. These are all achievable starting with a complete, accurate, and polished profile. Here’s how you can get started now: Sometimes the hardest part is the first step. Go to LinkedIn and look at your profile. Does it need to be updated? Do it now. If you’re the DIY type, you can download a free copy of our LinkedIn Profile Guide to help you create a more powerful profile (no form required). Like this post? Sign up for our emails here. The post Now is the time to fix your LinkedIn profile—here’s why appeared first on Biznology."
},
{
"id" : "1482451200-health-20care-2016-12-23-it-s-not-the-weather-that-makes-christmas-so-deadly",
"site" : "Washington Post Blog",
"title": "It’s not the weather that makes Christmas so deadly",
"url": "http://localhost:4000/health%20care/2016/12/23/it-s-not-the-weather-that-makes-christmas-so-deadly.html",
"categories" : ["Health Care"],
"tags" : ["Health Care"],
"authors" : ["Carolyn Y. Johnson"],
"publishedDate" : "2016-12-23 00:00:00 +0000",
"content" : " iStock Every winter, deaths from heart-related conditions rise in the United States. Plotted on a graph, the rise in deaths looks like a hill — with two spikes at the top when deaths sharply increase. “Sticking up out of this hill, are two — you might say — television towers. One television tower at Christmas and one television tower at New Year’s,” said David Phillips, a sociologist at the University of California, San Diego, who first pointed out the phenomenon now nicknamed the “Merry Christmas Coronary,” the “Happy New Year Heart Attack,” or the more reserved “Christmas Holiday Effect.” Now, a group of researchers has further ruled out any notion that the bump in holiday deaths could be accounted for by the overall wintertime effect. In a study published in the Journal of the American Heart Association, researchers examined data from New Zealand, where the holidays fall in the middle of summer. They found that the number of cardiac-related deaths outside of hospitals rose 4 percent over the Christmas holiday — resulting in about four additional deaths per year in the small country. The study also found that people who died from heart-related causes during Christmastime were slightly younger than those who died of similar causes during the rest of the year.  Journal of the American Heart Association No one knows exactly why this uptick happens, but Philip Clarke, an economist at the University of Melbourne who oversaw the new study, said that it’s possible that people are traveling over Christmas and may not be as familiar with medical facilities. They may forego care. It’s also possible that a change in diet or stress levels plays a role. Medical facilities may be less well-staffed. 
Phillips said that his own work has shown similar mortality spikes among specific religious and ethnic groups just after important holidays, which could support the idea that postponement of care is playing a role in the rise in deaths, rather than a shift in hospital staffing. He’s also seen a slight decline in deaths before the holiday, suggesting that sick people could be holding on for the occasion. “The most plausible candidate is a degradation of medical care,” he said, whether it is from patients being less likely to seek medical care or staff being less experienced and skilled, or simply fewer of them. “You could say to patients: Don’t ignore warning signs.” The findings reinforce the pattern initially observed in the United States years ago, and they might raise the question of why anyone would bother to study this in the first place, since it’s already been shown. Researchers redo studies using different data sources, such as the New Zealand population, for an important reason: to see if the findings hold up. This process is called replication, and it’s essential to science and medicine. That’s because some study findings may be due to chance or even publication bias, when studies showing a positive result get published and ones that don’t seem to show anything get shoved in a file drawer. Replication offers a powerful way to check whether a result is more universal, holding up beyond the particular situation described in a single study. “What we were intrigued with is that you could do a study in the southern hemisphere, where you get the holiday effect — but it’s obviously a very different season, the height of summer rather than winter,” Clarke said. “Reproduce the results, but in a very different context.” Read More: The deadliest day of the year is almost upon us The sobering thing doctors do when they die Under Obamacare, fewer people skipped doctors’ visits because of cost"
},
{
"id" : "1482451200-cost-20and-20quality-insurance-medicaid-medicare-syndicate-the-20health-20law-cadillac-20tax-repeal-20and-20replace-2016-12-23-if-republicans-repeal-health-law-how-will-they-pay-for-replacement",
"site" : "Kaiser Health News Blog",
"title": "If Republicans Repeal Health Law, How Will They Pay For Replacement?",
"url": "http://localhost:4000/cost%20and%20quality/insurance/medicaid/medicare/syndicate/the%20health%20law/cadillac%20tax/repeal%20and%20replace/2016/12/23/if-republicans-repeal-health-law-how-will-they-pay-for-replacement.html",
"categories" : ["Cost and Quality","Insurance","Medicaid","Medicare","Syndicate","The Health Law","Cadillac Tax","Repeal and Replace"],
"tags" : ["Cost and Quality","Insurance","Medicaid","Medicare","Syndicate","The Health Law","Cadillac Tax","Repeal and Replace"],
"authors" : ["Julie Rovner"],
"publishedDate" : "2016-12-23 00:00:00 +0000",
"content" : "Leading Republicans have vowed that even if they repeal most of the Affordable Care Act early in 2017, a replacement will not hurt those currently receiving benefits. Republicans will seek to ensure that “no one is worse off,” said House Speaker Paul Ryan, R-Wis., in an interview with a Wisconsin newspaper earlier this month. “The purpose here is to bring relief to people who are suffering from Obamacare so that they can get something better.” But that may be difficult for one big reason — Republicans have also pledged to repeal the taxes that Democrats used to pay for their health law. Without that funding, Republicans will have far less money to spend on whatever they opt for as a replacement. “It will be hard to have comparable coverage if they start with less money,” Gail Wilensky, a health economist who ran the Medicare and Medicaid programs under President George H.W. Bush, said in an interview. “Repealing all the ACA’s taxes as part of repeal and delay only makes a true replacement harder,” wrote Loren Adler and Paul Ginsburg of the Brookings Institution in a white paper out this week. It “would make it much more difficult to achieve a sustainable replacement plan that provides meaningful coverage without increasing deficits.” The health law’s subsidies to individuals buying insurance and the Medicaid expansion are funded by two big pots of money. This KHN story also ran on [NPR](http://www.npr.org/sections/news/). It can be republished for free ([details](/syndication)). [](http://www.npr.org/sections/news/) The first is a series of taxes, including levies on individuals with incomes greater than $200,000, health insurers, makers of medical devices, brand-name drugmakers, people who use tanning salons, and employer plans that are so generous they trigger the much-maligned “Cadillac Tax.” Some of those measures have not yet taken effect. 
However, the Congressional Budget Office estimated in early 2016 that repealing those provisions would reduce taxes by an estimated $1 trillion over the decade from 2016-2025. The other big pot of money that funds the benefits in the health law comes from reductions in federal spending for Medicare (and to a lesser extent, Medicaid). Those include trims in the scheduled payments to hospitals, insurance companies and other health care providers, as well as increased premiums for higher-income Medicare beneficiaries. CBO estimated in 2015 that cancelling the cuts would boost federal spending by $879 billion from 2016 to 2025. The GOP, in the partial repeal bill that passed in January and was vetoed by President Barack Obama, proposed to cancel the tax increases in the health law, as well as the health premium subsidies and Medicaid expansion. But it would have kept the Medicare and Medicaid payment reductions. Because the benefits that would be repealed cost more than the revenue being lost through the repeal of the taxes, the result would have been net savings to the federal government — to the tune of about $317.5 billion over 10 years, said CBO. But those savings — even if Republicans could find a way to apply them to a new bill — would not be enough to fund the broad expansion of coverage offered under the ACA. If Republicans follow that playbook again, their plans for replacement could be hampered because they will still lose access to tax revenues. That means they cannot fund equivalent benefits unless they find some other source of revenue. Some analysts fear those dollars may come from still more cuts to Medicare and Medicaid. “Medicare and Medicaid face fundamental threats, perhaps the most since they were established in the 1960s,” said Edwin Park of the liberal Center on Budget and Policy Priorities, in a webinar last week. Republicans in the House, however, have identified one other potential source of funding. 
“Our plan caps the open-ended tax break on employer-based premiums,” said their proposal, called “A Better Way.” House Republicans say that would be preferable to the Cadillac Tax in the ACA, which is scheduled to go into effect in 2020 and taxes only the most generous plans. But health policy analysts say ending the employer tax break could be even more controversial. Capping the amount of health benefits that workers can accept tax-free “would reduce incentives for employers to continue to offer coverage,” said Georgetown University’s Sabrina Corlette. James Klein, president of the American Benefits Council, which represents large employers, said they would look on such a proposal as potentially more damaging to the future of employer-provided insurance than the Cadillac Tax, which his group has lobbied hard against. “This is not a time one wants to disrupt the employer marketplace,” said Klein in an interview. “It seems perplexing to think that if the ACA is going to be repealed, either in large part or altogether, it would be succeeded by a proposal imposing a tax on people who get health coverage from their employer.” Wilensky said that as an economist, getting rid of the tax exclusion for employer-provided health insurance would put her “and all the other economists in seventh heaven.” Economists have argued for years that having the tax code favor benefits over cash wages encourages overly generous insurance and overuse of health services. But at the same time, she added, “I am painfully aware of how unpopular my most favored change would be.” Republicans will have one other option if and when they try to replace the ACA’s benefits — not paying for them at all, thus adding to the federal deficit. While that sounds unlikely for a party dedicated to fiscal responsibility, it wouldn’t be unprecedented. In 2003 the huge Medicare prescription drug law was passed by a Republican Congress — with no specified funding to pay for the benefits."
},
{
"id" : "1482451200-economy-2016-12-23-donald-trump-really-might-start-a-trade-war",
"site" : "Washington Post Blog",
"title": "Donald Trump really might start a trade war",
"url": "http://localhost:4000/economy/2016/12/23/donald-trump-really-might-start-a-trade-war.html",
"categories" : ["Economy"],
"tags" : ["Economy"],
"authors" : ["Matt O'Brien"],
"publishedDate" : "2016-12-23 00:00:00 +0000",
"content" : " (AP Photo/Matt York, File) Maybe we should take Donald Trump seriously and literally when he says he’ll start a trade war with China. That, after all, is what slapping a tariff on Chinese imports would mean. Just ask Beijing. They’ve already started to think about how they would retaliate if he does do it. Not that markets have noticed. They’re blissfully unaware on the way to Dow 20,000, and what they hope is Trump’s tax cut nirvana. Indeed, they seem to believe that Trump’s tough line on China was either just a campaign ploy to help him in the Rust Belt, or else a proposal that has no chance of being passed by a Republican Congress. This is naïve. First off, trade is one of the only two issues Trump really seems to care about, and has had a consistent position on. (The other is his open admiration for authoritarian leaders, which dates back to his 1990 praise for the Tiananmen Square massacre—yes, the massacre itself). He’s always thought that other countries were taking advantage of us, whether that was Japan in 1988, or Mexico in 1998, or China in 2008, and that the only way to stop this was to rip up our allegedly lopsided trade deals and start threatening to put tariffs up unless we got better ones. He talked about it a lot during the campaign. He means it. If this wasn’t clear before, it should be now that Trump has tapped economist Peter Navarro to lead the newly-created National Trade Council. 
Navarro has had a long and not-so-varied career of worrying about trade and war with China in books like, well, “Death By China,” “The Coming China Wars,” and “Crouching Tiger: What China’s Militarism Means for the Rest of the World.” Indeed, he co-authored a white paper for the Trump campaign arguing that—this might sound familiar—we should use tariffs as a “negotiating tool” to force other countries, as he’s said before, to stop using export subsidies, stop manipulating their currencies down, stop stealing our intellectual property, and stop using cost-competitive but pollution-heavy factories. The irony is there was actually a pretty good case for doing something like this when Navarro proposed it six years ago, but not anymore. That’s because China is already doing the things a tariff would supposedly make it do. It has eliminated its export subsidies, spent a trillion dollars propping its currency up instead of pushing it down, and is taking real steps to make its air breathable again. But that doesn’t seem to matter much to Trump. He views everything, even a voluntary exchange like trade, as a deal where there’s a winner and a loser. In this case, the country that sells more than it buys abroad is the victor, and the other is the vanquished. Trump, then, thinks that a trade deficit itself, more than, say, an undervalued currency, is proof enough that something must be done. And he doesn’t need Congress to help him do it, either. He can impose tariffs on his own using executive authority under existing laws. There are already whispers he might do just that to put a 10 percent tariff on all imports. Don’t think you weren’t warned. You were, over, and over, and over again. Maybe we should listen?"
},
{
"id" : "1482451200-medicare-public-20health-2016-12-23-as-cardiac-rehab-efforts-succeed-medicare-offers-hospitals-incentives-to-expand-programs",
"site" : "Kaiser Health News Blog",
"title": "As Cardiac Rehab Efforts Succeed, Medicare Offers Hospitals Incentives To Expand Programs",
"url": "http://localhost:4000/medicare/public%20health/2016/12/23/as-cardiac-rehab-efforts-succeed-medicare-offers-hospitals-incentives-to-expand-programs.html",
"categories" : ["Medicare","Public Health"],
"tags" : ["Medicare","Public Health"],
"authors" : [],
"publishedDate" : "2016-12-23 00:00:00 +0000",
"content" : ""
},
{
"id" : "1482364800-uncategorized-2016-12-22-why-the-rate-of-americans-on-probation-has-plummeted-to-a-20year-low",
"site" : "Washington Post Blog",
"title": "Why the rate of Americans on probation has plummeted to a 20-year low",
"url": "http://localhost:4000/uncategorized/2016/12/22/why-the-rate-of-americans-on-probation-has-plummeted-to-a-20year-low.html",
"categories" : ["Uncategorized"],
"tags" : ["Uncategorized"],
"authors" : ["Keith Humphreys"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : " (iStock) Probation is the most expansive component of the U.S. correctional system, overseeing more criminal offenders than all jails and prisons combined. During the height of the U.S. crime wave, from the mid-1970s to the mid-1990s, the number of probationers grew by thousands every month, many of whom eventually ended up serving prison terms for more serious crimes. It can therefore only be good news that a Bureau of Justice Statistics report released yesterday shows that the rate of Americans on probation is at a more than 20-year low. The number of probationers dropped by 78,700 in 2015 — about 2 percent — which is roughly equivalent to the population of Gary, Ind. This latest annual decline continues a trend that began almost a decade ago and has brought the rate of probation supervision down to 1,522 per 100,000 adults, its lowest point since early 1994. Three factors likely contributed to this generational low in the proportion of Americans who are on probation. First, the crime rate has plummeted in recent decades, which obviously translates into fewer people becoming involved in the correctional system. Second, innovative probation models using “swift, certain and fair” behavior-change strategies have promoted rehabilitation and helped more offenders successfully exit the criminal justice system. Third, declining caseloads allow probation departments to devote more resources to each offender, creating a virtuous cycle in which successful probation feeds further success throughout the community supervision system. Keith Humphreys is a professor of psychiatry at Stanford University. More from Wonkblog: Poor white kids are less likely to go to prison than rich black kids Young people are committing much less crime. Older people are still behaving badly. The U.S. has more jails than colleges. Here’s a map of where those prisoners live"
},
{
"id" : "1482364800-gender-uncategorized-2016-12-22-the-days-of-stayathome-moms-are-long-gone-data-show",
"site" : "Washington Post Blog",
"title": "The days of stay-at-home moms are ‘long gone,’ data show",
"url": "http://localhost:4000/gender/uncategorized/2016/12/22/the-days-of-stayathome-moms-are-long-gone-data-show.html",
"categories" : ["Gender","Uncategorized"],
"tags" : ["Gender","Uncategorized"],
"authors" : ["Danielle Paquette"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : " (Amy Cavenaile/The Washington Post; iStock) The days of stay-at-home mothers are behind us, asserts a new report from the Center for American Progress, which analyzed national labor data and found that, across the country, the share of moms who financially support their families continues to grow. Nearly two-thirds of American moms these days (64.4 percent) are breadwinners, the researchers found. That’s a hop from 63.3 percent in 2012, the year of the last analysis, and a leap from 1970, when roughly a quarter could claim the title. “Long gone are the days when the majority of middle- and upper-income women stayed home to raise families full time,” the authors wrote. “Instead, in most families, either both parents work or the household is headed by a single parent.” Forty-two percent of mothers in the United States solely or mostly pull the wagon, while 22.4 percent bring home at least a quarter of household earnings.  Courtesy of the Center for American Progress It’s important to note, though, that many women still opt out of employment after having kids, desiring to be the primary nurturing force in their children’s lives. Sometimes, however, the soaring cost of child care or a lack of paid maternity leave knocks them out of the workforce. Much of the time, mothers work because they have to work. One middle-class income can no longer support most households, and culture has shifted away from the rigid gender roles of generations past. But public policy hasn’t caught up, argues Sarah Jane Glynn, senior policy adviser at the Center for American Progress, a left-leaning think tank in the nation’s capital. “The fact that women are bringing home a significant portion of their families’ incomes does not mean that there is gender parity in the workforce, nor does it mean that working parents and caregivers have the supports they so vitally need,” she wrote. 
“A lack of policies such as universal paid family and medical leave, paid sick days and workplace flexibility still hold women back from reaching their full economic potential.” In less Washington terms, Glynn means an employer can withhold a day of pay if a worker misses a shift to take care of a sick baby. Because women shoulder a disproportionate amount of domestic responsibilities, this kind of lost earnings hits them harder. Meanwhile, with the United States being the only advanced economy in the world that does not guarantee any paid parental leave, income interruptions frequently follow the birth of a child. (Roughly 43 million American workers have no paid sick leave or parental leave, according to the White House.) The economic blow hits mothers of color harder, as their families are more likely to depend on them for income than white mothers’ families are. Seventy-one percent of black mothers and 41 percent of Hispanic mothers were primary or sole breadwinners in 2015, the most recent data available, compared with 37 percent of white mothers, the CAP paper shows.  Courtesy of the Center for American Progress During the campaign, both presidential candidates pledged to make life easier for working mothers. President-elect Donald Trump was the first Republican contender to release policy plans on paid maternity leave and cheaper child care. (He has not said whether he will prioritize these efforts during his first months in office.) Trump has proposed allowing parents to deduct the average cost of child care in their area from their taxes and creating a national maternity leave program, which, his team said, would pay birth mothers an average of $300 in weekly benefits for up to six weeks. Proponents have called the measures a step in the right direction, considering that no existing national policies provide concentrated support for Americans who juggle both work and kids. 
Detractors, however, say singling mothers out in the law could hurt their workforce progress. “You’re going to create a scenario where employers have even more incentive to view workers differently, whether it’s conscious or not, and lead to discrimination of women of childbearing age,” said Glynn, who supports implementing a leave program for both mothers and fathers. “It also contributes to the idea that it’s a woman’s job to take care of a baby, while dads are just sperm donors. But fathers of young kids are often desperate for more time with them.” More on Wonkblog: The stark disparities of paid sick leave Ivanka Trump champions working moms — except the ones who design her clothes Why men fear paternity leave"
},
{
"id" : "1482364800-aging-cost-20and-20quality-medicare-navigating-20aging-syndicate-chronic-20disease-20care-2016-12-22-new-medicare-rules-should-help-high-need-patients-get-better-treatment",
"site" : "Kaiser Health News Blog",
"title": "New Medicare Rules Should Help ‘High Need’ Patients Get Better Treatment",
"url": "http://localhost:4000/aging/cost%20and%20quality/medicare/navigating%20aging/syndicate/chronic%20disease%20care/2016/12/22/new-medicare-rules-should-help-high-need-patients-get-better-treatment.html",
"categories" : ["Aging","Cost and Quality","Medicare","Navigating Aging","Syndicate","Chronic Disease Care"],
"tags" : ["Aging","Cost and Quality","Medicare","Navigating Aging","Syndicate","Chronic Disease Care"],
"authors" : ["Judith Graham"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : "Doctors have complained for years that they’re not paid adequately for time-consuming work associated with managing care for seriously ill older patients: consulting with other specialists, talking to families and caregivers, interacting with pharmacists and more. That will change on Jan. 1, as a new set of Medicare regulations goes into effect. Under the new rules, physicians will be compensated for legwork involved in working in teams — including nurses, social workers and psychiatrists — to improve care for seniors with illnesses such as diabetes, heart failure and hypertension. Care coordination for these “high need” patients will be rewarded, as will efforts to ensure that seniors receive effective treatments for conditions such as anxiety or depression. Comprehensive evaluations of older adults with suspected cognitive impairment will get a lift from new payments tied to the standards that physicians now will be required to follow. NAVIGATING AGING [Navigating Aging](http://khn.org/topics/navigating-aging/) focuses on medical issues and advice associated with aging and end-of-life care, helping America’s 45 million seniors and their families navigate the health care system. To contact Judith with a question or comment, [click here.](http://khn.org/columnists) For more KHN coverage of aging, [click here.](http://khn.org/topics/aging/) The new Medicare policies reflect heightened attention to the costliest patients in the health care system — mostly older adults who have multiple chronic conditions that put them at risk of disability, hospitalization, and an earlier-than-expected death. Altogether, 10 percent of patients account for 65 percent of the nation’s health spending. It remains to be seen how many physicians will embrace the services that the government will now reimburse. Organizations that advocated for the new payment policies hope they’ll make primary care and geriatrics more attractive areas of practice in the years ahead. 
Here’s a look at what is entailed: Complex Chronic Care Management Two years ago, Medicare began paying nurses, social workers and medical assistants to coordinate care for seniors with two or more serious chronic conditions. But low reimbursement and burdensome requirements discouraged most medical practices from taking this on. New payments for “complex chronic care management” are more generous (an average $93.67 for the first hour, $47.01 for each half hour thereafter) and can be billed more often, making them more attractive. They’ll cover services such as managing seniors’ transitions from the hospital back home or to a rehabilitation center, coordinating home-based services, connecting patients with resources, and educating caregivers about their conditions. Many practices will be able to hire care managers with this new financial support, said Dr. Peter Hollmann, secretary of the American Geriatrics Society and chief medical officer of University Medicine, a medical group practice associated with Brown University’s medical school. To illustrate the benefits, he tells of a recent patient with diabetes, hypertension and heart failure, who was retaining fluid and had poorly controlled blood sugar. After a care manager began calling the 72-year-old man every few days, asking if he was checking his blood sugar or gaining weight, Hollmann adjusted doses of insulin and diuretics. “The patient remained at home and he’s doing well, and we likely prevented a hospitalization,” Hollmann said. Cognitive Impairment Assessment Making a dementia diagnosis is difficult, and primary care physicians often fail to do so on a timely basis. But new Medicare policies may help change that by specifying what cognitive examinations should entail and offering enhanced payments. Physicians who conduct these evaluations are now expected to meet 10 requirements.
In addition to performing a careful physical exam and taking a detailed history, they need to assess an older adult’s ability to perform activities of daily living, their safety, behavioral and neuropsychiatric symptoms, and caregivers’ knowledge, needs and abilities. Use Our Content This KHN story can be republished for free ([details](/syndication)). All the medications the senior is taking should be evaluated, and standardized tests used to assess cognition. Efforts to elicit the patient’s goals and values need to occur in the context of advance planning, and a care plan must be crafted and shared with caregivers. Medicare will pay $238.30 for the initial assessment and additional fees for creating a care plan and performing care management. “Hopefully, this will kick start the development of practices that provide these dementia-related services,” said Dr. Robert Zorowitz, senior medical director at OptumCare CarePlus, a managed Medicare long-term care program in New York City. Care Between Patient Visits Until now, the rule has been: if the doctor is with a patient, he can bill for his time. But if he takes home medical records to review at night or talks by phone with a caregiver who’s concerned about her elderly mother, that time goes unpaid. That will change next year: Medicare will begin paying $113.41 for the first hour spent in these kinds of activities and $54.55 for every subsequent half hour. For the first time, “this recognizes the significant and valuable services that physicians perform in between face-to-face visits,” said Dr. Phillip Rodgers, co-chair of the public policy committee at the American Academy of Hospice and Palliative Medicine. Physicians will also get extra reimbursement for extra time they spend in person with complex patients or their caregivers. Dr.
Paul Tatum, an associate professor of clinical family and community medicine at the University of Missouri School of Medicine, recently scheduled a half hour for a patient in his mid-70s with high blood pressure, kidney disease, skin issues and cognitive impairment. But the visit ran to 90 minutes when it became clear the gentleman was more confused than ever, falling, not eating well, not taking medications, and needed more help. “Much of what we did for this patient fits in the new Medicare codes, which recognize the extent of what’s needed to care for people with complex illnesses,” the doctor said. Integrating Behavioral Health Research has shown that seniors with depression — a frequent complication of serious illness — benefit when primary care physicians collaborate with psychologists or psychiatrists and care managers track their progress. Now, Medicare will begin paying $142.84 for the first 70 minutes that physicians and behavioral health providers work together, $126.33 for the next hour, and $66.04 per half hour for a care manager who stays in touch with patients and tracks whether they’re improving. Care managers may work on site or off; psychologists and psychiatrists will be called for consultations, as needed. “Accessing mental health services is a really big problem for my patients, and having professionals ready to work with me and compensated to do so will be extraordinarily valuable,” said Rodgers of the hospice and palliative medicine academy. We’re eager to hear from readers about questions you’d like answered, problems you’ve been having with your care and advice you need in dealing with the health care system. Visit khn.org/columnists to submit your requests or tips. KHN’s coverage related to aging & improving care of older adults is supported by The John A. Hartford Foundation."
},
{
"id" : "1482364800-cost-20and-20quality-medicare-drug-20costs-2016-12-22-medicare-pays-for-a-kidney-transplant-but-not-the-drugs-to-keep-it-viable",
"site" : "Kaiser Health News Blog",
"title": "Medicare Pays For A Kidney Transplant But Not The Drugs To Keep It Viable",
"url": "http://localhost:4000/cost%20and%20quality/medicare/drug%20costs/2016/12/22/medicare-pays-for-a-kidney-transplant-but-not-the-drugs-to-keep-it-viable.html",
"categories" : ["Cost and Quality","Medicare","Drug Costs"],
"tags" : ["Cost and Quality","Medicare","Drug Costs"],
"authors" : ["Richard Harris, NPR News"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : "The federal government will pay more than $100,000 to give someone a kidney transplant, but after three years, the government will often stop paying for the drugs needed to keep that transplanted kidney alive. Constance Creasey is one of the thousands of people who find themselves caught up by this peculiar feature of the federal kidney program. Creasey started kidney dialysis about 12 years ago after her kidneys failed. That meant going to a dialysis center three times a week, for three hours per session. (A typical patient undergoes three to five hours of dialysis per session). “The first three years of dialysis was hard. I walked around with this dark cloud. I didn’t want to live, I really didn’t,” she says. Being dependent on these blood-cleansing machines was physically and emotionally draining. But she stuck it out for 11 years. Medicare pays for dialysis, even for people under the age of 65. It also pays for kidney transplants for people with end-stage renal disease. This copyrighted story comes from [NPR’s Shots blog](http://www.npr.org/sections/health-shots/2016/12/22/506319553/medicare-pays-for-a-kidney-transplant-but-not-the-drugs-to-keep-it-viable). All rights reserved.[](http://www.npr.org/blogs/health/) “Finally, a year and a half ago, transplant came. I was a little apprehensive but I said OK. And I call her Sleeping Beauty, that’s my kidney’s name.” Creasey, a 60-year-old resident of Washington, D.C., no longer needs to spend her days at a dialysis center. She has enough energy for a part-time job at a home furnishing store and time to enjoy life’s simple pleasures. “I was able to do my favorite thing — go to the pool — and I was just loving it because it’s like I had no restrictions now,” she says. But there is still a dark cloud on Creasey’s horizon. Medicare’s kidney program currently pays for a large share of the expensive drugs she needs to take twice a day to prevent her body from rejecting the transplanted kidney. 
But under federal rules, that coverage will disappear three years after the date of her transplant. “I have a year and a half to prepare, or save,” she says. “How am I going to do this?” She’s already paying copays, premiums and past medical bills. She says she sleeps on the floor because she considers buying a bed a luxury she can’t afford. She has no idea what kind of insurance she’ll be able to get after her Medicare coverage runs out. And she was shocked to discover how big the bills could be. One day she went into the pharmacy to pick up her drugs, and the Medicare payment hadn’t been applied. The pharmacist told her she’d need to pay a $600 copay for the one-month supply. “And I’m like are you kidding me? Six hundred? What am I going to do? I can’t pay that!” A social worker at MedStar Georgetown University Hospital in Washington, D.C., where Creasey got her transplant, sorted that out. But it’s not a permanent solution. The three-year cutoff for Medicare payments is a common problem, says Dr. Matthew Cooper, who runs the kidney transplant program at the hospital. That’s especially so since many people with serious kidney disease have low incomes in the first place. “It’s probably about 30 percent of people who find themselves in a troublesome spot at this 36-month mark,” he says. Some people end up trying to stretch out their drug supplies by not taking them as often as they need to, he says. “We see that a lot.” But this isn’t like skipping a pain pill and bearing the consequences. People lose their transplanted kidneys through organ rejection if they don’t take their medicine religiously. Rita Alloway, a clinical pharmacist at the University of Cincinnati, says she also encounters this false economy. “If we were telling them to take four pills twice a day, they may start taking three pills twice a day without telling us, to extend their coverage that they had for the prescriptions they had,” she says. 
If people tell her that they can’t afford it, she can help them get the medications for free, Alloway says. But sometimes people are too proud to admit their financial distress, she said. And instead of spending $15,000 a year on these anti-rejection drugs, people go back onto dialysis, which costs $90,000 a year or more. And that’s taxpayers’ money, provided with no time limit. Kevin Longino, CEO of the National Kidney Foundation, says it’s not just affecting the people who have transplants, but those who are on the long list waiting their turn for an organ to become available. “The tragedy is you have so many people on the wait list already, and to have someone unnecessarily have rejection because they can’t afford the drugs and to have to go back into the system — it’s just a difficult thing to explain, why we’re allowing that to happen.” Longino says insurance companies are making the problem even worse. Some have reclassified anti-rejection drugs as “specialty drugs,” and they now require patients to pay for a percentage of the cost, rather than a more traditional fixed copayment. Longino encountered that himself after he had a kidney transplant about a dozen years ago. He says his costs went from $150 a month to $950 a month when his insurance company made that cost-sharing shift. He, Alloway and Cooper have been trying to persuade Congress to pass a bill to fix this problem. Rep. Michael Burgess, R-Texas, and Rep. Ron Kind, D-Wis., have introduced bills more than once, but they have not moved through Congress. Burgess’ office says they plan to try again next year. Lawmakers are concerned about the costs. Severe kidney disease already costs Medicare a staggering $30 billion a year, and there’s no official cost-benefit analysis showing whether covering transplant drugs for everybody would save money overall. “The Medicare [savings] in maintaining this drug coverage is better than putting people on dialysis,” Cooper says. “To me this is a no-brainer. 
I just cannot understand why we haven’t got to the point where we say Medicare coverage for life for immunosuppressive drugs because people will benefit and money will be saved.” For Constance Creasey, this is not an abstract conversation. “Those pills are my life right now,” she says. “I’m trying not to worry, but it’s hard.”"
},
{
"id" : "1482364800-aging-health-20industry-medicare-2016-12-22-medicare-issues-final-rule-for-controversial-plan-to-bundle-some-payments-for-heart-care",
"site" : "Kaiser Health News Blog",
"title": "Medicare Issues Final Rule For Controversial Plan To Bundle Some Payments For Heart Care",
"url": "http://localhost:4000/aging/health%20industry/medicare/2016/12/22/medicare-issues-final-rule-for-controversial-plan-to-bundle-some-payments-for-heart-care.html",
"categories" : ["Aging","Health Industry","Medicare"],
"tags" : ["Aging","Health Industry","Medicare"],
"authors" : [],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : ""
},
{
"id" : "1482364800-business-economy-politics-2016-12-22-donald-trump-has-a-favorite-carmaker-and-that-might-be-a-problem",
"site" : "Washington Post Blog",
"title": "Donald Trump has a favorite carmaker and that might be a problem",
"url": "http://localhost:4000/business/economy/politics/2016/12/22/donald-trump-has-a-favorite-carmaker-and-that-might-be-a-problem.html",
"categories" : ["Business","Economy","Politics"],
"tags" : ["Business","Economy","Politics"],
"authors" : ["Max Ehrenfreund and Jim Tankersley"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : " Ed Welburn, left, then-General Motors vice president of global design, watches the world debut of the 2015 Cadillac Escalade with Melania and Donald Trump in 2013\\. (Cadillac) The love affair between Cadillac and Donald Trump peaked in the late 1980s, when they teamed up on a line of limousines called “the Trump Series.” The most luxurious model came with black, Italian-leather seats, aircraft sound insulation, a television and VCR, a cellular telephone, 24-karat-gold plating, a hidden safe and a paper shredder. The car stretched nearly 23½ feet long. “I’m very honored that they built me the first one,” Trump said, unveiling the Golden Edition in 1988, “and, frankly, I deserve it.” Trump has enjoyed a long and sometimes lucrative relationship with Cadillac, the brand his father drove when Trump was a child. General Motors, which makes Cadillacs, was a regular advertiser on Trump’s television show, “The Apprentice,” where its products were sometimes featured in challenges for contestants. Cadillac was a longtime sponsor of a golf tournament at a Trump-owned course in Florida. Trump, in turn, has appeared at launch events to promote GM vehicles on several occasions, including one for the 2015 Cadillac Escalade. Trump is now president-elect, and he has styled himself as a critic in chief of American companies that move factory jobs to foreign countries. On the campaign trail, he made a few mentions of GM, which is in the middle of a $5 billion plan to expand production in Mexico. But the automaker has largely escaped the worst of Trump’s wrath. In a statement sent to reporters in June, Trump criticized GM’s Mexico expansion. “Many companies — like Ford, General Motors, Nabisco, Carrier — are moving production to Mexico,” the statement read. But in the version of the statement posted to the campaign’s website, the reference to GM has been removed. A GM spokesman said Trump has no business relationship with Cadillac. 
There is no evidence that his past relationships with GM have influenced his conduct as a candidate and president-elect. But the difference in how he has treated GM and, say, Ford — both iconic Detroit automakers — highlights a challenge Trump will face as president: how to avoid the appearance of playing favorites with companies he has done business with. Trump frequently criticized Ford on the campaign trail for its plans to move small-car production to Mexico. Last month, he announced that he had persuaded Ford not to move a production line of sport-utility vehicles from Kentucky to Mexico, a decision that the company and union officials said did not affect any American jobs. In a statement, Hope Hicks, a spokeswoman for Trump, said GM, “along with many other companies he is speaking to, will be encouraged to keep jobs here in the United States.” But Trump has not criticized GM for its plans to import an SUV built in China, or for large investments in production facilities in South Korea. He did not speak out when GM announced last month that it will lay off 2,000 workers at factories in Ohio and Michigan, nor when the company said this week that it would lay off 1,300 workers in Detroit. (The company blamed both moves on softening U.S. demand for smaller cars and sedans; one of the sedans produced at the Ohio plant facing layoffs is the Cruze, which is produced, in hatchback form, in Mexico.) GM produces about 19 percent of the cars it sells in North America in Mexico, similar to the figure for Fiat-Chrysler, according to data from the Center for Automotive Research in Ann Arbor, Mich.; Ford produces about 12 percent there. There are GM plants at four locations in Mexico, and the company announced two years ago that new investments would create about 5,600 new positions for Mexican workers. GM also imports the Buick Envision from China, a decision that has drawn the ire of organized labor. 
Workers have nicknamed the Envision the “Invasion.” On the campaign trail, Trump largely stayed mum on those decisions, even as he criticized Ford. For instance, in the opening minutes of his first debate with Democratic nominee Hillary Clinton, Trump falsely claimed Ford had plans to lay off employees. “Ford is leaving — you see that,” Trump said. “Their small-car division — thousands of jobs, leaving Michigan, leaving Ohio.” While Ford does plan to relocate its small-car production to Mexico, the company said its employees in the United States will continue working, building larger vehicles. One of the few moments when Trump mentioned GM on the trail came in Grand Rapids, Mich., a week before the election. There, he used layoffs at one of the company’s plants to attack not the company but his Democratic opponent. “GM laid off 314 workers at the Lake Orion assembly plant in 2013 because of imports from the South Korean trade deal pushed by Hillary,” Trump said. After winning the election, Trump appointed GM’s chairman and chief executive, Mary Barra, to an economic advisory panel. Cadillac Vice President of Marketing Don Butler and Ivanka Trump pose at the release party for a luxury art book published by Assouline in New York in December 2012. (Cadillac). In the past, Trump was a regular at the automaker’s events. In 2005, he appeared at the New York International Auto Show with Bob Lutz, then GM’s vice chairman. The pair held a news conference on the new Cadillac XLR-V, a muscle car, according to news reports. In 2013, Donald and Melania Trump appeared at the debut of the latest Escalade. The couple sat with Ed Welburn, then the company’s vice president for global design. “I think they’ve done a great job, really great job. They’ve really done a fantastic job. We love it,” Trump said, according to footage of that event from Cadillac.
Melody Lee, the director of brand marketing at Cadillac, said the company frequently invites “celebrities and influencers” to its events. “It’s a standard process,” she said, “and the Trumps must have been on that invitation list.” Last year, his campaign told The Washington Post that a Cadillac Escalade was one of two American cars that Trump owned at the time, along with a Tesla. GM also promoted its products on “The Apprentice,” his reality-television show. The Pontiac appeared on the show in 2005 and 2006, according to a statement from the company. Also in 2006, contestants were tasked with planning a three-hour training session for Chevrolet dealers, focused on the Tahoe. Luigi Zingales, an economist at the University of Chicago, said Trump’s divergent treatment of GM and Ford on the stump raises the specter of government favoritism to specific companies, a practice economists generally believe can hurt the economy. “It looks more and more [like] the behavior of a crony capitalist,” Zingales said. “The issue is you don’t want to be in a country where the president picks and chooses who to blast and who to promote in completely arbitrary fashion.” Not every past partnership with Trump has worked out for GM. The Trump line of limos, for example, never really made it to the road. At the unveiling, Trump called the car “the ultimate limousine to be found anywhere in the world.” In his book “The Art of the Deal,” Trump wrote that he received a “beautiful gold Cadillac Allanté” as a gift from the company on completing the deal. GM says today that only three of the Trump-edition cars were ever produced." | |
},
{
"id" : "1482364800-digital-20marketing-market-20research-advertising-artificial-20intelligence-b2b-20content-20marketers-2016-12-22-10-marketing-trends-to-watch-for-2017",
"site" : "biznology Blog",
"title": "10 Marketing trends to watch for 2017",
"url": "http://localhost:4000/digital%20marketing/market%20research/advertising/artificial%20intelligence/b2b%20content%20marketers/2016/12/22/10-marketing-trends-to-watch-for-2017.html",
"categories" : ["Digital Marketing","Market Research","Advertising","Artificial intelligence","B2B content marketers"],
"tags" : ["Digital Marketing","Market Research","Advertising","Artificial intelligence","B2B content marketers"],
"authors" : ["Richard Larson"],
"publishedDate" : "2016-12-22 00:00:00 +0000",
"content" : "As the end of the year approaches, it is time to look back on the year in marketing and analyze the trends we should expect to see in the coming new year. This helps you, as a marketer or business owner, be prepared for and to properly plan out your marketing for 2017. Here are 10 of the marketing trends to watch for 2017. Content marketing getting better One of the trends that will carry over from 2016 is that marketers are getting better at content marketing. By better, we mean they are having more success with content marketing campaigns. A MarketingProfs report shows that 62% of B2B content marketers are “much more” or “somewhat more” successful with their content marketing than they were last year. The same goes for B2C marketers as well. For the same survey last year, the reported success rates were at just 30%. Of course, this is a good thing because content is so important. However, it does mean being prepared to allocate more budget to content marketing. Mobile, mobile, mobile We know, you’ve been hearing about mobile for quite a few years. It may sound like a broken record, but that’s only because it is so important. Mobile devices are the source of more traffic on the internet than desktops. The other thing to think about when planning your mobile marketing strategy is that you must consider your mobile target audience as if they were moving targets. You, as a marketer, have unique insight to so much of your customers’ lives and activities thanks to mobile data and permissioned data. You know that your target audience’s’ needs change throughout the day and thanks to all that mobile data, you can customize your marketing to target your audience where they are and accordingly for what they are doing. Visual content Visuals are so important to all of your marketing efforts, not just social media. Visuals can help with your branding, help you stand out, get recognized, and be remembered. 
If you haven’t embraced visuals yet, it’s never too late to start incorporating them into your plan for 2017. How can you get started? Hire a photographer to take photos of your company and your products. Make sure a good portion of the photos are timeless so that you can continue to use them for different advertising and promotional campaigns. If you don’t have a design team on hand, look into some design apps that take the work out of photo editing and make some creative and catchy photos. Video content Of course, video content could fall under the visual content category we just mentioned above, but it deserves its own bullet point. Static images are not enough. Video has become so important. Video literally reaches out and grabs your audience’s attention. It doesn’t give viewers a choice, it just begins. Try both funny and emotional videos to really connect. Marketing to customers in-store Chances are you’ve done it yourself: checked a competitor’s price on your phone while standing in a store. The only way to compete with this is to market to your customers while they are in your stores. Native advertising Expect to see an upswing in native advertising in 2017. The formats are getting better and more refined. If you have shied away from native advertising, you might feel more comfortable trying it if it is right for your brand. More major publishers are likely to start offering this method of advertising in 2017. Chatbots and AI gaining in popularity Earlier this year, Mark Zuckerberg announced that third parties would be able to access the Facebook Messenger platform. As a result, chatbots have grown quickly in popularity. Using chatbots is a slippery slope, but they can definitely increase the speed of engagement and allow for real-time interaction. Influencers increasingly important Influencers have become increasingly important tools for marketers to use. Influencer marketing is the best form of user-generated content.
The public is quick to believe what others have to say about your product. When those “others” have a huge audience full of believers, your message’s reach grows exponentially and in a positive direction. Organic traffic is on the decline Even after you spend the time and effort to develop the best content, it can be lost and never seen when you post it across social channels expecting organic content to go viral. Now that Facebook and Instagram do not post content chronologically, most users will only see the most popular posts. How do you get those posts to be popular in the first place? It likely isn’t going to happen organically. As a result, plan for 2017 to start paying for your “viral” content. Live and vicarious experiences will increase If you haven’t noticed the popularity of streaming video content, then you haven’t been paying attention. In 2016, Facebook Live introduced a whole new novice audience to live-streaming video. Facebook made it so accessible to the public that consumers will now be more open to seeing live-streaming advertising on different platforms. Live streaming is beginning a new trend in vicarious experiences and will cross over into other immersive types of content like 360-degree video and augmented reality. Social analytics Social analytics will become increasingly important, just as important as other analytics you use in your marketing analyses. Social listening and collecting social data will help you through 2017. Be sure to follow this trend and look into tools that will help you process all of the valuable social analytics information that is out there, waiting to be mined. Like this post? Sign up for our emails here. The post 10 Marketing trends to watch for 2017 appeared first on Biznology."
},
{
"id" : "1482278400-health-20industry-medicare-public-20health-syndicate-hospitals-2016-12-21-latest-hospital-injury-penalties-include-crackdown-on-antibiotic-resistant-germs",
"site" : "Kaiser Health News Blog",
"title": "Latest Hospital Injury Penalties Include Crackdown On Antibiotic Resistant Germs",
"url": "http://localhost:4000/health%20industry/medicare/public%20health/syndicate/hospitals/2016/12/21/latest-hospital-injury-penalties-include-crackdown-on-antibiotic-resistant-germs.html",
"categories" : ["Health Industry","Medicare","Public Health","Syndicate","Hospitals"],
"tags" : ["Health Industry","Medicare","Public Health","Syndicate","Hospitals"],
"authors" : ["Jordan Rau"],
"publishedDate" : "2016-12-21 00:00:00 +0000",
"content" : "The federal government has cut payments to 769 hospitals with high rates of patient injuries, for the first time counting the spread of antibiotic-resistant germs in assessing penalties. The punishments come in the third year of Medicare penalties for hospitals with patients most frequently suffering from potentially avoidable complications, including various types of infections, blood clots, bed sores and falls. This year the government also examined the prevalence of two types of bacteria resistant to drugs. Based on rates of all these complications, the hospitals identified by federal officials this week will lose 1 percent of all Medicare payments for a year — with that time frame beginning this past October. While the government did not release the dollar amount of the penalties, they will exceed a million dollars for many larger hospitals. In total, hospitals will lose about $430 million, 18 percent more than they lost last year, according to an estimate from the Association of American Medical Colleges. The reductions apply not only to patient stays but also will reduce the amount of money hospitals get to teach medical residents and care for low-income people. Get the data Looking for more on your hospital or state? * **Full List:** [769 Hospitals Penalized For Patient Safety In 2017: Data Table](http://khn.org/Njg1NzMw) * [Download as PDF](https://kaiserhealthnews.files.wordpress.com/2016/12/hac-pdf.pdf) * [Download as CSV](https://kaiserhealthnews.files.wordpress.com/2016/12/hac-csv.csv) Forty percent of the hospitals penalized this year escaped punishment in the first two years of the program, a Kaiser Health News analysis shows. Those 306 hospitals include the University of Miami Hospital in Florida, Cambridge Health Alliance in Massachusetts, the University of Michigan Health System in Ann Arbor and Mount Sinai Hospital in New York City. 
Nationally, hospital-acquired conditions declined by 21 percent between 2010 and 2015, according to the federal Agency for Healthcare Research and Quality, or AHRQ. The biggest reductions were for bad reactions to medicines, catheter infections and post-surgical blood clots. Still, hospital harm remains a threat. AHRQ estimates there were 3.8 million hospital injuries last year, which translates to 115 injuries during every 1,000 patient hospital stays during that period. Each year, at least 2 million people become infected with bacteria that are resistant to antibiotics, including nearly a quarter million cases in hospitals. The Centers for Disease Control and Prevention estimates 23,000 people die from them. Infection experts fear that soon patients may face new strains of germs that are resistant to all existing antibiotics. Between 20 and 50 percent of all antibiotics prescribed in hospitals are either not needed or inappropriate, studies have found. Their proliferation — inside the hospital, in doctors’ prescriptions and in farm animals sold for food — has hastened new strains of bacteria that are resistant to many drugs. One resistant bacteria that Medicare included in its formula for determining financial penalties for hospitals is methicillin-resistant Staphylococcus aureus, or MRSA, which can cause pneumonia and bloodstream and skin infections. MRSA is prevalent outside of hospitals and sometimes people with it show no signs of disease. But these people can bring the germ into a hospital, where it can be spread by health care providers and be especially dangerous for older or sick patients whose immune system cannot fight the infection. Hospitals have had some success in reducing MRSA infections, which dropped by 13 percent between 2011 and 2014, according to the CDC. AHRQ estimates there were 6,300 cases in hospitals last year. The second bacteria measured for the penalties is Clostridium difficile, known as C.
diff, a germ that can multiply in the gut and colon when patients take some antibiotics to kill off other germs. It can also spread through contaminated surfaces or hands. This KHN story also ran on [NPR](http://www.npr.org/sections/health-shots/2016/12/22/506489368/medicare-penalizes-hospitals-in-crackdown-on-antibiotic-resistant-infections). It can be republished for free ([details](/syndication)). While it can be treated by antibiotics, C. diff can also become so serious that some patients need to have part of their intestines surgically removed. C. diff can cause diarrhea and can be deadly for the elderly and other vulnerable patients. C. diff has challenged infection control efforts. While hospital infections dropped 8 percent from 2008 to 2014, there was a “significant increase” in C. diff that final year, the CDC says. AHRQ estimated there were 100,000 hospital cases last year. “The reality is we don’t know how to prevent all these infections,” said Dr. Louise Dembry, a professor at the Yale School of Medicine and president of the Society for Healthcare Epidemiology of America. The Hospital-Acquired Condition Reduction Program also factors in rates of infections from hysterectomies, colon surgeries, urinary tract catheters and central line tubes. Those infections carry the most weight in determining penalties, but the formula also takes into account the frequency of bed sores, hip fractures, blood clots and four other complications. Specialized hospitals, such as those that treat psychiatric patients, veterans and children, are exempted from the penalties, as are hospitals with the “critical access” designation for being the only provider in an area. Of the remaining hospitals, the Affordable Care Act requires that Medicare penalize the 25 percent that perform the worst on these measures, even if they have reduced infection rates from previous years. 
That inflexible quota is one objection the hospital industry has with the penalties. In addition, many hospitals complain that they are penalized because of their vigilance in detecting infections, even ones that do not cause any symptoms in patients. Academic medical centers in particular have been frequently punished. “The HAC penalty payment program is regarded as rather arbitrary, so other than people getting upset when they incur a penalty, it is not in and of itself changing behavior,” said Nancy Foster, vice president for quality and patient safety at the American Hospital Association. Federal records show that 347 hospitals penalized last year will not have payments reduced because their performance was better than others. Those include Harbor-UCLA Medical Center in Los Angeles, the Johns Hopkins Hospital in Baltimore and the University of Tennessee Medical Center in Knoxville. Over the lifetime of the penalty program, 241 hospitals have been punished in all three years, including the Cleveland Clinic; Intermountain Medical Center in Murray, Utah; Ronald Reagan UCLA Medical Center in Los Angeles; Grady Memorial Hospital in Atlanta; Northwestern Memorial Hospital in Chicago; and Brigham & Women’s Hospital in Boston. The penalties come as the Centers for Medicare & Medicaid Services also launches new requirements for hospitals to ensure that the use of antibiotics is limited to cases where they are necessary and to be circumspect in determining which of the drugs are most likely to work for a given infection. Hospitals will have to establish these antibiotic stewardship programs as a condition of receiving Medicare funding under a regulation the government drafted last summer. 
Lisa McGiffert, who directs Consumers Union’s Safe Patient Project, said that as a result of Medicare’s penalties and other efforts, “more hospitals are thinking more about appropriate use of antibiotics.” However, she said, “I think most hospitals do not have effective antibiotic stewardship programs yet.”"
}, | |
{ | |
"id" : "1482278400-health-20industry-medicare-syndicate-hospitals-2016-12-21-hospitals-and-surgery-centers-play-tugofwar-over-america-s-ailing-knees", | |
"site" : "Kaiser Health News Blog", | |
"title": "Hospitals And Surgery Centers Play TugOfWar Over America’s Ailing Knees", | |
"url": "http://localhost:4000/health%20industry/medicare/syndicate/hospitals/2016/12/21/hospitals-and-surgery-centers-play-tugofwar-over-america-s-ailing-knees.html", | |
"categories" : ["Health Industry","Medicare","Syndicate","Hospitals"], | |
"tags" : ["Health Industry","Medicare","Syndicate","Hospitals"], | |
"authors" : ["Christina Jewett"], | |
"publishedDate" : "2016-12-21 00:00:00 +0000", | |
"content" : "Five years ago, Dr. Ira Kirschenbaum, an orthopedic surgeon in the Bronx who replaces more than 200 knees each year, would have considered it crazy to send a patient home the same day as a knee replacement operation. And yet there he was this year, as the patient, home after a few hours. A physician friend pierced his skin at 8 a.m. at a Seattle-area surgery center. By lunch, Kirschenbaum was resting at his friend’s home, with no pain and a new knee. “I’m amazed at how well I’m doing,” Kirschenbaum, 59, said recently in a phone interview, nine weeks after the operation. What felt to Kirschenbaum like a bold experiment may soon become far more standard. Medicare, which spends several billions of dollars a year on knee replacements for its beneficiaries — generally Americans 65 and over — is contemplating whether it will help pay for knee replacement surgeries outside the hospital, either in free-standing surgery centers or outpatient facilities. The issue is sowing deep discord in the medical world, and the debate is as much about money as medicine. Some physicians are concerned that moving the surgeries out of hospitals will land vulnerable patients in the emergency room with uncontrolled pain, blood clots or other complications. This KHN story also ran in [The New York Times](http://www.nytimes.com/2016/12/20/business/medicare-outpatient-knee-replacement-.html). It can be republished for free ([details](/syndication)). [](http://www.nytimes.com/) But proponents of the change say it can give patients more choice and potentially better care, as well as save Medicare hundreds of millions of dollars. Already, an “overwhelming majority” of commenters said they want to allow the surgeries out of hospitals, according to recent rule-making documents. The final decision, which could come within a year, would also act as a test of sorts for Donald Trump and his new administration. 
They will weigh whether to limit government controls, as Trump has often suggested, or to bend to pressure from hospitals and doctors, many of whom oppose the change. “I think the question will come down to two things,” said David Muhlestein, senior director for research at Leavitt Partners, a leading health consulting firm. “It’s the balance of trying to reduce regulations and let the market function — and the competing interest of vested parties.” Demand for total knee replacements is growing — 660,000 are performed each year in the United States. That number is likely to jump to two million annually by 2030, making this complex and expensive operation one of surgery’s biggest potential growth markets. Even if the policy change is made, Medicare would still pay for patients to get traditional inpatient surgery. But with the agency also paying for the bulk of outpatient procedures, there would be a huge shift in money — out of hospitals and into surgery centers. Medicare could save hundreds of millions of dollars if it no longer needed to pay for multiple-day stays at the hospital. Investors at the outpatient centers could profit greatly, as could some surgeons, because doctors often have an ownership stake in the outpatient centers where they operate. Whether the shift is beneficial for patients remains an open question. Medicare patients tend to spend nearly three days in a hospital, data shows. Forty percent of Medicare patients also spend time in a rehabilitation facility for further recovery. The data, which reflects knee replacement operations from 2014, suggests that Medicare patients are taking advantage of the post-operation support at hospitals and aftercare centers. Given that, it is unclear what percentage of eligible patients would choose outpatient care. 
But improvements in surgery — from new medicines to control bleeding to better pain management techniques — mean that, for some patients, the days of close medical supervision are no longer necessary. Kirschenbaum, who is in favor of the change, acknowledged that outpatient surgery would be the right move for only a small subset of his Medicare patients — perhaps 10 to 15 percent — who have good caretaking at home and few chronic health issues. But it would not be for the people who are frail, live alone or in a dwelling with stairs, he said. The decision about whether an outpatient surgery should be done instead of an inpatient one tends to be made by the physician and patient. “We want to make sure patients — when they go home, they’re safe, no question,” said Kirschenbaum, the chairman of orthopedics at Bronx-Lebanon Hospital Center and a founder of SwiftPath, a company that offers technical support to outpatient joint replacement centers. Perhaps of equal concern to patients are the financial consequences, because even though less care is given, outpatient procedures require higher out-of-pocket costs for patients. Medicare covers inpatient hospital stays, aside from a $1,288 deductible. While Medicare rules stipulate that the outpatient would pay no more than this amount for the procedure itself, he could face additional fees for items like medicines, and Medicare would not cover aftercare at a skilled nursing facility. The battle lines over outpatient knee replacements began forming in 2012, when Medicare first considered removing the surgeries from its “inpatient only” list of invasive and complicated medical procedures. Many orthopedic doctors and hospitals rose up in protest, calling the proposal “ludicrous” and “dangerous” and prompting Medicare to abandon the idea. Dr. 
Charles Moon, who has performed knee replacement surgeries at Cedars-Sinai Medical Center in Los Angeles, fired off a letter at the time saying that knee replacement patients stayed at his hospital for 2.5 days on average, and that that was “considered borderline safe” given the need to monitor patients’ response to clot-busting medications. Other objectors cited research showing that patients who received knee replacements as outpatients were twice as likely to die shortly afterward, and that even one-day-stay hospital patients were twice as likely to need a follow-up surgery, compared with those who remained inpatients longer. “While we realize this can be good for some patients, it’s not for all patients and all locations,” said Dr. Thomas C. Barber, the chairman for the American Academy of Orthopaedic Surgeons’ advocacy council. Yet the proposal has gained renewed momentum, backed aggressively by some surgeons and surgery center investors who say that their accumulating experience justifies the change. In recent months, Medicare has signaled a strong interest in outpatient knee replacements, noting the potential for “overall improved outcomes” as well as the potential savings for the government program. The final decision is made by Medicare officials in the annual course of proposing changes, seeking public input and announcing a final rule. If Medicare does decide to make a change, it would probably not be put into effect until a year or so later. In an interview, Thomas Wilson, the chief executive of the for-profit Monterey Peninsula Surgery Centers, an outpatient clinic, said his doctors have replaced knees of hundreds of adults — 59 years old on average, but up to 82 — with low complication rates and sky-high satisfaction rates. He said advances in surgical technique, anesthetics and patient education make it possible. 
Presented with such evidence, a panel that recommends hospital outpatient payment policies to Medicare officials unanimously recommended in August that Medicare remove the procedure from the “inpatient only” payment list. Wilson said that as a first step, doctors should use strict criteria for choosing which patients are good candidates, like a low to moderate body mass index and a healthy heart and lungs. Patients who meet the criteria are teamed with a friend or family member who works as a coach. The patient and coach attend an educational session before the operation, and the coach is also there to help after. The patient is typically discharged after 23 hours in the outpatient center, and a home health service or private nurse follows up. Patients also go on to physical therapy. “Our mix is like our regular mix of patients,” said Wilson, whose center advertises a knee replacement surgery for $17,030. “It’s not what we call unicorns, not 49-year-old marathon runners. These are average folks who need to have a knee or hip replaced and they’re generally not sick.” But Barber and others worry that moving the procedure outside the hospital could become a norm or an expectation, even though some patients, especially those with complicating conditions like diabetes and heart disease, need the added support of a hospital team. Patient safety could be compromised, they warned. Kirschenbaum said undergoing surgery has changed the way he approaches patients. Now he can roll up his pant leg, show a scar and tell them: “You can do this, too.” In the operating room, “with a knife in my hand, nothing has changed,” he said. “But what has changed is how we treat them before and after. The education, support and being available — it’s very important.” This story has been corrected. An earlier version of this article misstated Medicare’s policy on certain outpatient surgeries. 
For surgeries that can be done either as an inpatient or an outpatient, outpatients can be charged no more than the inpatient deductible for the procedure itself; the usual 20 percent outpatient copay doesn’t apply. KHN’s coverage related to aging & improving care of older adults is supported by The John A. Hartford Foundation."
}, | |
{ | |
"id" : "1482278400-digital-20marketing-social-20media-pr-artwork-benefits-brand-20showcase-2016-12-21-8-commonly-overlooked-ways-to-make-social-media-work-for-you", | |
"site" : "biznology Blog", | |
"title": "8 commonly overlooked ways to make social media work for you", | |
"url": "http://localhost:4000/digital%20marketing/social%20media/pr/artwork/benefits/brand%20showcase/2016/12/21/8-commonly-overlooked-ways-to-make-social-media-work-for-you.html", | |
"categories" : ["Digital Marketing","Social Media/PR","artwork","benefits","brand showcase"], | |
"tags" : ["Digital Marketing","Social Media/PR","artwork","benefits","brand showcase"], | |
"authors" : ["Derek Miller"], | |
"publishedDate" : "2016-12-21 00:00:00 +0000", | |
"content" : "Are you getting the most out of social media? We’ll safely assume that you and your business are using social media. You probably have Facebook, Twitter, Instagram, and Google Plus accounts. Maybe you post content daily to all of your accounts. But are you making the most out of them, and are you reaping the benefits? Discover the following eight commonly overlooked ways to make social media work for you. Choose platforms wisely Even though multiple social media platforms exist, you don’t have to have an account for each one. Instead, think through your choices and choose the ones that make the most sense for the brand you are trying to promote. For example, if you are promoting artwork, Pinterest and Instagram are must-have accounts. However, if you are promoting computer services, you would be better off using Twitter or Facebook. Additionally, if you focus on a few accounts instead of multiple accounts, you are more likely to post regularly, keeping your content fresh and up-to-date for your readers. Use graphics Research shows that images draw emotion. Why is emotion important? Since emotions are what attract people’s attention to one business over another, you can use emotion to best promote your business. When you make an emotional connection, you are more likely to get viewers to digest the rest of your content or buy the products you are trying to sell. Use graphics, images, and videos to appeal to your readers, and watch your social media ratings soar. Create effective headlines Before people click on a post, you have to get their attention. How do you get their attention? Try writing effective headlines. Headlines need to grab readers’ attentions. Ask a question, make a shocking statement, or state your topic in a way that piques interest. You also need to give enough information so that readers know what they are going to discover. If your headline reads, “You’re Missing Out,” will readers know what you are talking about? 
Instead, title your headline “You’re Missing Out: 5 Ways to Improve Your Google Search Rating.” Gain the interest of all readers looking to improve their website search rankings. Don’t oversell Social media is a fantastic tool to grow your business and showcase your brand. However, too much selling is off-putting to many people. Unfortunately, when you have a goal of promoting or selling your brand, you can easily want to push your products and oversell your audience. Instead, focus on providing tips or information your shoppers appreciate and sparsely mix in your goods and services to attract more customers and long-term viewership. Develop specific content for each platform As an individual, you can get away with sharing the same post or comment on all of your social media sites, but as a business owner, this practice won’t work. For you, a business owner, to be productive and gain the trust and loyalty of social media followers, you need to post platform-specific content to all of your accounts. By creating distinctive content, you prevent readers from getting bored and promote connection through your business presence on all social media platforms. Show your personality How do you stand out from your competitors? You show your personality. No matter what you are selling or what services you offer, your distinction comes from who you are. Your posts need to reflect your ideas, your thoughts, and your feelings. All of these techniques show consumers who you are and what’s behind your brand. They provide more ways for readers to connect with you. Again, feelings are incredibly powerful, and when readers feel like they know you and relate to you, they are more likely to buy products from you or return to you for a service. Don’t post old news No one wants to hear old news over and over. 
In fact, your audience members want to be the ones who say to their friends, “Did you hear…?” When deciding what to post, do your best to share new information or a new twist on something old. Give readers something new to share; don’t simply reshare yesterday’s news. Ask questions Social media is interactive, and this interactivity is one of the reasons people love using social media so much. When you use social media for business purposes, you need to keep this point in mind and find ways to interact with your customers. An excellent way to create this interaction is by adding questions to the end of your posts or by tweeting questions tied to your posts. Invite your viewers to engage in conversation and share their thoughts and comments to create a relationship with them. Using social media to promote your business is a wise idea. Be sure you incorporate the above eight overlooked ways into your promotion plans to make social media work for you so that you don’t miss out on the benefits to your business. Like this post? Sign up for our emails here. The post 8 commonly overlooked ways to make social media work for you appeared first on Biznology."
}, | |
{ | |
"id" : "1482192000-aging-health-20industry-medicare-public-20health-syndicate-hospitals-nursing-20homes-ratings-2016-12-20-when-looking-for-a-nursing-home-you-may-get-little-help-from-your-hospital", | |
"site" : "Kaiser Health News Blog", | |
"title": "When Looking For A Nursing Home You May Get Little Help From Your Hospital", | |
"url": "http://localhost:4000/aging/health%20industry/medicare/public%20health/syndicate/hospitals/nursing%20homes/ratings/2016/12/20/when-looking-for-a-nursing-home-you-may-get-little-help-from-your-hospital.html", | |
"categories" : ["Aging","Health Industry","Medicare","Public Health","Syndicate","Hospitals","Nursing Homes","Ratings"], | |
"tags" : ["Aging","Health Industry","Medicare","Public Health","Syndicate","Hospitals","Nursing Homes","Ratings"], | |
"authors" : ["Jordan Rau"], | |
"publishedDate" : "2016-12-20 00:00:00 +0000", | |
"content" : "At age 88, Elizabeth Fee looked pregnant, her belly swollen after days of intestinal ailments and nausea. A nurse heard a scream from Fee’s room in a nursing home, and found her retching “like a faucet” before she passed out. The facility where she died in 2012 was affiliated with a respected San Francisco hospital, California Pacific Medical Center, and shared its name. Fee had just undergone hip surgery at the hospital, and her family, pleased with her care, said they chose the nursing home with the hospital’s encouragement. Laura Rees, Fee’s elder daughter, said she was never told that the nursing home had received Medicare’s worst rating for quality — one star. Nor, she said, was she told that state inspectors had repeatedly cited the facility for substandard care, including delayed responses to calls for aid, disrespectful behavior toward patients and displaying insufficient interest in patients’ pain. “They handed me a piece of paper with a list of the different facilities on it, and theirs were at top of the page,” Rees said in an interview. “They kept pointing to their facility, and I was relying on their expertise and, of course, the reputation of the hospital.” Fee had an obstructed bowel, and state investigators faulted the home for several lapses in her care related to her death, including giving her inappropriate medications. In court papers defending a lawsuit by Fee’s family, the medical center said the nursing home’s care was diligent. The center declined to discuss the case for this story. This KHN story also ran in [The Washington Post](https://www.washingtonpost.com/national/health-science/when-youre-looking-for-a-nursing-home-your-hospital-may-offer-little-help/2016/12/16/ee2ad9ee-b58f-11e6-b8df-600bd9d38a02_story.html?utm_term=.0ee6b3e18dc0). It can be republished for free ([details](/syndication)). 
The selection of a nursing home can be critical: 39 percent of facilities have been cited by health inspectors over the past three years for harming a patient or operating in such a way that injuries are likely, government records show. Yet many case managers at hospitals do not share objective information or their own knowledge about nursing home quality. Some even push their own facilities over comparable or better alternatives. “Generally hospitals don’t tell patients or their families much about any kind of patterns of neglect or abuse,” said Michael Connors, who works at California Advocates for Nursing Home Reform, a nonprofit in San Francisco. “Even the worst nursing homes are nearly full because hospitals keep sending patients to them.” Hospitals say their recalcitrance is due to fear about violating a government decree that hospitals may not “specify or otherwise limit” a patient’s choice of facilities. But that rule does not prohibit hospitals from sharing information about quality, and a handful of health systems, such as Partners HealthCare in Massachusetts, have created networks of preferred, higher-quality nursing homes while still giving patients all alternatives. Such efforts to help patients are rare, said Vincent Mor, a professor of health services, policy and practice at the Brown University School of Public Health in Providence, R.I. He said that when his researchers visited 16 hospitals around the country last year, they found that only four gave any quality information to patients selecting a nursing home. “They’re giving them a laminated piece of paper” with the names of nearby nursing facilities, Mor said. For quality information, he said, “they will say, ‘Well, maybe you can go to a website,’” such as Nursing Home Compare, where Medicare publishes its quality assessments. 
The federal government may change this hands-off approach by requiring hospitals to provide guidance and quality data to patients while still respecting a patient’s preferences. The rule would apply to information not only about nursing homes but also about home health agencies, rehabilitation hospitals and other facilities and services that patients may need after a hospital stay. “It has a substantial opportunity to make a difference for patients,” said Nancy Foster, a vice president at the American Hospital Association. But the rule does not spell out what information the hospitals must share, and it has yet to be finalized — more than a year after Medicare proposed it. The rule faces resistance in Congress: The chairman of the House Freedom Caucus, Rep. Mark Meadows, R-N.C., has included it on a list of regulations Republicans should block early next year. The government has created other incentives for hospitals to make sure their patient placements are good. For instance, Medicare cuts payments to hospitals when too many discharged patients return within a month. “Hospitals didn’t use to care that much,” said David Grabowski, a professor of health care policy at Harvard Medical School. “They just wanted to get patients out. Now there’s a whole set of payment systems that reward hospitals for good discharges.” But sometimes hospitals go too far in pushing patients toward their own nursing homes. In 2013, for instance, regulators faulted a Wisconsin hospital for not disclosing its ties when it referred patients to its own nursing home, which Medicare rated below average. In 2014, a family member told inspectors that a Massachusetts hospital had “steered and railroaded” her into sending a relative to a nursing home owned by the same health system. Researchers have found that hospital-owned homes are often superior to independent ones. 
Still, a third of nursing homes owned by hospitals in cities with multiple facilities had lower federal quality ratings than at least one competitor, according to a Kaiser Health News analysis. The Lowest Rating  A family photo of Elizabeth Fee shortly before she died in 2012. (Robert Durell for KHN) Medicare’s Nursing Home Compare gave the nursing home where Elizabeth Fee died one star out of five, meaning it was rated “much below average.” The hospital’s case managers told Fee’s family that the nursing home was merely an extension of the hospital and that “my mother would receive the same excellent quality of care and attention,” said Rees, her daughter. But state inspectors found shortcomings in seven visits to the nursing home between August 2009 and October 2011, records show. Inspectors found expired medications during two visits and, at another, observed a nurse washing only her fingertips after putting an IV in a patient with a communicable infection. Just four months before Fee arrived, inspectors cited the nursing home for not treating patients with dignity and respect and for failing to provide the best care. One patient told inspectors that her pain was so excruciating that she couldn’t sleep but that nurses and the doctor did not check to see whether her pain medications were working. “Nobody listens to me,” the patient said. “I was born Catholic, and I know it’s not right to ask to die, but I want to die just to get rid of the pain.” Fee ate little and had few bowel movements, according to the state health investigation. Fee’s family had hired a private nurse, Angela Cullen, to sit with her. Cullen became increasingly worried about Fee’s distended belly, according to Cullen’s affidavit taken as part of the lawsuit. 
She said her concerns were brushed off, with one nurse declining to check Fee’s abdomen by saying, “I do not have a stethoscope.” On the morning of her death, an X-ray indicated Fee might have a bowel obstruction or other problem expelling stool, the inspectors’ report said. That evening, after throwing up a large quantity of matter that smelled of feces, she lost consciousness. She died of too much fluid and inhaled fecal matter in her lungs, the report said. Bills Of More Than $150,000  An undated family photo of Elizabeth Fee as a fashion model. (Robert Durell for KHN) In a court ruling, Judge Ernest Goldsmith of the San Francisco Superior Court wrote that Elizabeth Fee’s younger daughter, Nancy, “observed her mother drown in what appeared to be her own excrement.” Kathryn Meadows, the family’s attorney, said in a court filing that the nursing home’s bills exceeded $150,000 for the three-week stay. Sutter Health, the nonprofit that owns the medical center and the nursing home, emphasized in court papers that Elizabeth Fee arrived at the facility with a low count of platelets that clot blood. Sutter’s expert witness argued that the near-daily visits from a physician that Fee received “far exceeds” what is expected in nursing home care. The physician and his medical group have settled their part of the case and declined to comment or discuss the terms; the case against Sutter is pending. California’s public health department fined Sutter $2,000 for the violations, including for delaying 16 hours in telling the physician about Fee’s nausea, vomiting and swollen abdomen. Last year, Sutter closed the nursing home. A week or so after Fee died, a letter addressed to her from California Pacific Medical Center arrived at her house. 
It read: “We would appreciate hearing about your level of satisfaction with the care you received on our Skilled Nursing Rehabilitation Unit, the unit from which you were just discharged.” KHN’s coverage related to aging & improving care of older adults is supported by The John A. Hartford Foundation. Coverage of aging and long-term care issues is supported by The SCAN Foundation. Clarification: This story was updated to correct a reference to Elizabeth Fee."
}, | |
{ | |
"id" : "1482192000-content-20marketing-organic-20search-search-20engine-20optimization-2015-20oxnard-20train-20derailment-big-20data-2016-12-20-feed-google-fresh-sandwiches-every-day-instead-of-christmas-dinner-onceayear", | |
"site" : "biznology Blog", | |
"title": "Feed Google fresh sandwiches every day instead of Christmas dinner onceayear", | |
"url": "http://localhost:4000/content%20marketing/organic%20search/search%20engine%20optimization/2015%20oxnard%20train%20derailment/big%20data/2016/12/20/feed-google-fresh-sandwiches-every-day-instead-of-christmas-dinner-onceayear.html", | |
"categories" : ["Content Marketing","Organic Search","Search Engine Optimization","2015 Oxnard train derailment","Big Data"], | |
"tags" : ["Content Marketing","Organic Search","Search Engine Optimization","2015 Oxnard train derailment","Big Data"], | |
"authors" : ["Chris Abraham"], | |
"publishedDate" : "2016-12-20 00:00:00 +0000", | |
"content" : "Why do search professionals scatter like roaches when the kitchen light comes on? Why is everyone acting so sneaky all the time? Why do SEO professionals skulk around dark alleyways, offering their search engine services in furtive, hurried whispers? What’s up with that? Don’t we all know that Google Search is a somniloquist! Whenever he is able to catch some shut-eye, a nanosecond at a time, he cries out in his slumber, “feed me . . . feed me Seymour.” Not only is Google a Glutton, but he’s always hungry — and a picky eater, too. In a perfect world, all of Google’s food would be steamy hot, bold with spices and herbs, and nutritionally rich. If you and I don’t constantly develop ways to provide Google with all the taste-sensations, fresh out of the pan, out of the oven, and then beautifully-plated, then Google’ll definitely reheat leftovers — hell, he’ll fish out the meals ready to eat (MREs). But, honestly, Google would always prefer to eat healthy. Quality over quantity. Google would love to get enough fiber, enough vitamins and minerals, enough healthy fat and presentation. The internet webosphere is like greater Washington, DC on a weekday lunchtime: food trucks everywhere! Yes, also restaurants, fast food, fast-casual, brown bags of tuna prepared at home, hot dog and burrito carts, office cantinas, take out places, and by-the-pound buffet joints. Before the age of the food truck, there were some carts offering haute cuisine, but it wasn’t until the rise of the food truck when the entire power structure lunch at least, was set: dirty water dogs, burgers, buffet salad, or sit down restaurant food. The barrier to entry was pretty impossible save for a few rich folks doing it for vanity or experienced folks doing it for shareholder value. And the paperwork, licensing, and all the other food-hoops required. 
But DC is big, hungry, and wants all the taste-sensations, fresh out of the pan, out of the oven, and then beautifully-plated; and we want our lunch to be delicious, steamy hot, bold with spices and herbs, and nutritionally rich. Because DC’s already hungry, DC’s only somewhat a snob! The majority of folks who work in DC during the work week are balancing time, price, proximity, healthiness, preference, and deliciousness. And all you need to do is discover as many of those preferences as you can and cook to order. You can feed Google. You can even become Google’s favorite type of food, snack, lunch, sandwich, dessert, cheat, breakfast, dinner, late-night bite. But you, like every great cook, every great chef, cannot just make something awesome once. You don’t make the Guinness Book of World Records and then call it done. SEO is not one-and-done! It’s feeding the newsroom rather than writing a single novel just to have written a novel. I’m a pretty good cook. In fact, I have made some amazing things perfectly, exactly once (remember that Bûche de Noël I made that one time with the powdered sugar snow, the branches, the ganache and cake?). But Google prefers hot fresh donuts over even my _Bûche de Noël_ once it’s a week old. So, stop sneaking around and stop trying to be way fancier than you’re able to provide every single day. Google wants your content food as hungrily as it wants the President’s latest transcript or the top headlines from the New York Times. But only if it’s at least as fresh, as nutritional, and as tasty as the other good stuff around it. I sell web site and branding services for my buddy Mike McDermott of Bash Foo, and the vast majority of your competitors can’t cook at all; those who can only cook a couple of times a year at most, give or take a couple years. While the bar is super-low for 99% of your competitors, the bar is nosebleed-high for the remaining 1% who have all that sorted out. 
Also, since the webinternetosphere is a global market, mostly, that 1% is still a very large number. Google doesn’t think so. Google thinks that it really sucks that only 1% of all online content-providers offer more than complete crap. Those 1% (who are generally the same people who are in the 1% in the real world), the best-of-breed in Google Search, are the same people that Google, in its love of the little guy and its passion for egalitarianism and equal access based on an impossibly-low barrier to entry, fights hard to disempower. Google wants diversity — your diversity — but Google also knows that the people who search using Google are also impatient, intolerant of junk results, unwilling to suffer ugly, unable to trust a site that is rarely if ever updated, and unsure about sites that haven’t kept up with technology and design (so many of our websites are the equivalent of shag carpet, orange appliances, avocado green counter tops, an old stove, and a tiny ancient fridge, with no stainless or granite or backsplash to be seen anywhere!). Come up with a content marketing plan that is the equivalent of my simple peasant meal of eggs, chicken, greens, fish, herbs, and spice, and then run with it. Make it every day. Just make sure it’s fresh, it’s honest, it’s made with the best ingredients possible, and you don’t cut corners. Put too much gravy or cream or béarnaise on your dish and maybe that’s an attempt to hide a bunch of flaws. Gilding the lily is almost always a way to give an often deceptively attractive or improved appearance. Cook simply, show your work, make it basic, use good ingredients, plate it lovingly, deliver it quickly (you all need faster sites), and you’ll become Google’s favorite — at least when it comes to the particular fare you’re offering, within your niche. Now, your turn. 
It’s essential to think of Google as hungry and in need of what you — or anybody — have to contribute (Google’s like Wikipedia that way, but unlike Wikipedia, you’re allowed — encouraged — to create your own page!). So, that box you gladly checked when you finished your website three years ago isn’t a completed task. How dare you! It was just the very first version of a constantly expanding, growing, changing, and living collection of documents. OK, after all of this talk about food, I’m ready to eat — ready, set, Publish! Like this post? Sign up for our emails here. The post Feed Google fresh sandwiches every day instead of Christmas dinner once-a-year appeared first on Biznology." | |
}, | |
{ | |
"id" : "1482105600-agile-20marketing-content-20marketing-internet-20marketing-organic-20search-public-20relations-2016-12-19-how-storytelling-can-shape-the-corporate-brand-and-culture", | |
"site" : "biznology Blog", | |
"title": "How storytelling can shape the corporate brand and culture", | |
"url": "http://localhost:4000/agile%20marketing/content%20marketing/internet%20marketing/organic%20search/public%20relations/2016/12/19/how-storytelling-can-shape-the-corporate-brand-and-culture.html", | |
"categories" : ["Agile Marketing","Content Marketing","Internet Marketing","Organic Search","Public Relations"], | |
"tags" : ["Agile Marketing","Content Marketing","Internet Marketing","Organic Search","Public Relations"], | |
"authors" : ["Jay Gronlund"], | |
"publishedDate" : "2016-12-19 00:00:00 +0000", | |
"content" : "Technology has transformed our world into a data-obsessive circus where information is unbelievably accessible, connectivity is constant, and unpredictable events always surprise and engulf us. Call this extreme clutter and volatility. With so much information and multi-tasking surrounding us, it has become a challenge to restore simplicity, clarity and focus in our communications. These excessive conditions provide the main impetus for the re-emergence of storytelling for inspiring, engaging and connecting to others. Storytelling is ageless and remains the most powerful form of persuasion. Socrates recognized the value of storytelling, so did Aesop, Jesus, Muhammad, Confucius and even Mark Twain. Today the power of storytelling has been scientifically proven: Neuroscientists have shown that the brain was built to wander on average over 1,000 times per day (e.g. including daydreams). They also found that storytelling stops this wandering and engages the listener (they call this “neuro-coupling”). Bruce Perry, an expert on brain development, says that “neural systems fatigue quickly, actually within 4-8 minutes, and become less responsive,” but can be stimulated and sustained by storytelling. Artificial Intelligence specialists have been studying how our brain actually works, especially how we file and store all this information that the brain absorbs every day. They discovered that the brain does not process information in “files” (e.g., like a computer program). As an example it sorts information from a PowerPoint presentation in a way that the first and last items on a list are usually remembered (also any item that has an emotional impact), and the rest is discarded as “trash” and never retrieved. Instead, the brain more effectively files and retrieves information when there is a context, as in the form of a story. 
Reinforcing this discovery, author and marketing professor Jennifer Aaker from Stanford notes that people remember stories as much as 22 times more than they do facts alone. So what can storytelling do to improve communications, process our changing world, and especially help shape a corporate brand and culture? Vibrant leaders now recognize that storytelling can create an emotional connection, which is the heart of good branding. It engages listeners emotionally, creates empathy, and inspires action. Importantly, neuroscience has also concluded that humans are more likely to make decisions based on emotions, not rational thinking. Our world is changing dramatically and so leaders are more challenged than ever to adapt to such a groundswell of populist trends, technological advances, declining trust in the establishment, globalization, growing uncertainty (particularly with the incoming Trump administration), and fundamentally what a corporation should stand for today. All these changes can affect a corporate brand and culture, so it is incumbent for a CEO to explain and especially inspire support for any updates on its corporate values and strategy. Simply using words and sharing data with customers and employees can be too cerebral and esoteric, but using storytelling to communicate “who we are,” “what we have learned,” and “why we are changing” will be far more captivating and motivating. Storytelling describes a journey and is ideal for meaningful change. For millennials, storytelling represents an ideal form of communication. This “first digital generation” thrives on social media, which is different from traditional media mainly in that it involves a one-on-one conversation that begs for engagement, versus the one-to-many in mass marketing. Many older managers don’t understand or even resent the independent, restless, unpredictable tendencies of millennials. 
However, millennials represent a huge opportunity for creativity and innovative ideas, so they should not be ignored. They do want to learn and respect experience, albeit sometimes in trying ways, but the key to maximizing their potential is to engage them. And this is what storytelling does. More companies today are using storytelling to recruit and train new employees–Apple, IBM, 3M, Nike, Coca Cola, Disney, Microsoft, NASA, and other forward-thinking organizations. In addition, as social media becomes more mainstream for advertising, they are using storytelling to engage prospective customers in blogs, videos, newsletters, content branding, and other digital communication vehicles. Millennials simply don’t trust traditional advertising–95% rely on feedback from friends for purchase decisions instead and find stories much more credible and trustful for learning about products. Storytelling is also ideal for young entrepreneurs who focus more on cost-efficient digital media and realize that stories about their personal experience can create a strong emotional connection. To update a corporate culture and strengthen a brand, one must learn more about the types of stories that will work, depending of course on the audience and their aspirations, the different situations (e.g., for a new leader, change in direction, new challenges, etc.) and the various nuances for making a story credible, compelling and emotionally engaging. But it all starts with a recognition of the power of storytelling in communications. Like this post? Sign up for our emails here. The post How storytelling can shape the corporate brand and culture appeared first on Biznology." | |
}, | |
{ | |
"id" : "1481846400-drinks-winter-2016-12-16-homemade-irish-cream", | |
"site" : "smitten kitchen Blog", | |
"title": "homemade irish cream", | |
"url": "http://localhost:4000/drinks/winter/2016/12/16/homemade-irish-cream.html", | |
"categories" : ["Drinks","Winter"], | |
"tags" : ["Drinks","Winter"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-12-16 00:00:00 +0000", | |
"content" : "Look, we all have to draw the line somewhere. I have over the years insisted that making some things from scratch were just crazy, best left to others, and one by one come around and worse, as if I’d forgotten my repudiation of five minutes earlier like some sort of toddler, extolled the virtues of doing so. Cases in point: Graham crackers, marshmallows, bagels, dulce de leche, pop tarts, rainbow cookies, goldfish crackers, apple strudel, fully from-scratch hot fudge sundae cakes and Russian honey cakes but if you were to suggest I should make my own yogurt, croissants or sushi, despite the fact that I would be delighted if you made any of these things, doubly so if you brought some to me right now, I would probably rather unpack the last box from our last move (two-plus years ago), not even jokingly labeled “Unfiled Files.” Look, we all have to draw the line somewhere. I mean, what’s next if I cross these lines? Milling my own flours? Smoking my own pork belly? Making our own Bailey’s-style Irish cream? Well, actually: yes. And here I go again: But it was so easy! You could and totally should do this at home! I had heard over the years that you could make this at home easily but — and I think this is the fulcrum on which we balance our yup/nope choices to cook things that amply exist outside our kitchens — I wasn’t unhappy with what I could buy (Bailey’s) so why would I bother? Irish cream has always been a favorite cold-weather indulgence, in or outside coffee. I’ve even made french toast with it. We always have a bottle around. But in the last couple years, I’ve found it almost too sweet to drink and I guess you could say we were on a break. Read more »" | |
}, | |
{ | |
"id" : "1481846400-content-20marketing-digital-20marketing-web-20metrics-business-20advantages-businesses-2016-12-16-content-marketing-data-it-s-not-geeky-and-it-s-not-boring", | |
"site" : "biznology Blog", | |
"title": "Content marketing data It’s not geeky and it’s not boring", | |
"url": "http://localhost:4000/content%20marketing/digital%20marketing/web%20metrics/business%20advantages/businesses/2016/12/16/content-marketing-data-it-s-not-geeky-and-it-s-not-boring.html", | |
"categories" : ["Content Marketing","Digital Marketing","Web Metrics","business advantages","businesses"], | |
"tags" : ["Content Marketing","Digital Marketing","Web Metrics","business advantages","businesses"], | |
"authors" : ["Andrew Schulkind"], | |
"publishedDate" : "2016-12-16 00:00:00 +0000", | |
"content" : "Data gets a bad rap. Too often when we think of data, our eyes glaze over as we envision people with pocket protectors lecturing us on, say, the advantages of Bayesian inference in statistical analysis. Or worse: insurance actuaries helpfully calculating the odds of our death. But even as data is frequently much more accessible than it had been, most marketers still think of it as a necessary evil or something that simply doesn’t help them unless they’re a Fortune 500 CPG firm with huge pools of information to work with. In reality, even small businesses can benefit from a better understanding of how to use the information they’re already gathering. Here’s an example—a data scientist uncovered two interesting facts as he examined data relating to traffic in New York City: He found the absolute worst place to park in NYC He defined when rush hour starts and ends in NYC Not exactly the kind of stuff that TMZ is going to report breathlessly–unless the worst place to park happens to be in front of “Brangelina’s” apartment–but it is certainly interesting if you run any sort of business that has trucks doing deliveries in Manhattan. (Sadly, the data says that rush hour in Manhattan lasts from about 8:30 am to 6:30 pm. In other words, all day.) In other words, data is only boring when it’s not relevant. If you’re in Des Moines (and I’m not knocking Des Moines; I’ve been there and it’s a cool town), you don’t care about parking in NYC. But if you’re shelling out hundreds or thousands of dollars each month in parking fines, then the data here is anything but boring. Here’s a quick exercise to help you put this to good use: what’s the most productive page on your website? In all likelihood, you have a trove of data about traffic on your website. And if you can’t answer me within a few minutes, chances are that’s all you have. Because you haven’t found a way to turn the data into information and the information into insights. 
So you’re leaving potentially valuable data-driven business advantages on the table. If you don’t know what the most productive page on your site is, how can you devote resources to creating more content on the same topic, or in the same format? How will you know to promote that content via your most productive channels? And while we’re on the subject, what are your most productive channels? Stop thinking of your data as boring. Start thinking of it as a resource to be tapped. This is likely to require some dedication and perhaps even some expert outside help–or someone internally with a blend of analytical and creative skills. Done right, you’ll increase your marketing effectiveness and drive down your costs. Because you won’t be wasting time circling the block. You’ll be parked–legally–and already at work helping your clients. (By the way, send me your answer on your site’s most productive page. If you include the metrics you used to determine why it’s your best page, I’ll send you a quick report on factors you may not be considering.) Like this post? Sign up for our emails here. The post Content marketing data: It’s not geeky and it’s not boring appeared first on Biznology." | |
}, | |
{ | |
"id" : "1481760000-internet-20marketing-marketing-20automation-web-20metrics-website-20search-ai-20systems-2016-12-15-cognitive-content-strategy", | |
"site" : "biznology Blog", | |
"title": "Cognitive content strategy", | |
"url": "http://localhost:4000/internet%20marketing/marketing%20automation/web%20metrics/website%20search/ai%20systems/2016/12/15/cognitive-content-strategy.html", | |
"categories" : ["Internet Marketing","Marketing Automation","Web Metrics","Website Search","AI systems"], | |
"tags" : ["Internet Marketing","Marketing Automation","Web Metrics","Website Search","AI systems"], | |
"authors" : ["James Mathewson"], | |
"publishedDate" : "2016-12-15 00:00:00 +0000", | |
"content" : "Managers of large sites must balance search AI with navigation IA My job involves building content systems and platforms for the world’s largest private digital publisher—IBM. Most of the people I talk to outside of IBM have no idea of the size and scale of our publishing efforts. Here is an example of a Twitter conversation related to the topic: The goal is no different for small sites and large sites: you need to help your users find the content that answers their questions in the fewest number of clicks. But as a site scales up, the level of difficulty in marking up and otherwise tagging content for findability increases exponentially. At a certain point, tagging alone can make findability worse than semantic search without tagging at all. Incorrectly tagged content surfaces irrelevant experiences to users. When you get over 100 products at perhaps 100 pages per (counting all your user documentation), you reach a point of diminishing returns on marking up content for search engines and navigation systems. Humans are very error-prone when they tag things, especially if multiple tags apply to one piece of content. Taxonomies grow beyond human abilities to manage and use them manually. There are all kinds of ambiguities between related tags. And most companies change their product names or introduce new products faster than the taxonomy managers can keep up. The answer to these problems is to give humans augmented intelligence (AI) to develop better taxonomies, to do a better job with content markup, and to find relevant content even when it is not well-tagged. That’s what this article is about. Cognitive search: The antidote to poor tagging A decade ago, I was a taxonomist and content strategist frustrated by these issues. At a certain point, I reasoned that a good search engine would help users find the most relevant content on our huge site with less effort than navigation. 
Our data showed that users preferred search to navigating through our megamenu of options. Besides the problems with choosing the right options for the menu, there are human limitations to megamenus. Humans don’t like long lists of things. Our brains get easily overwhelmed and, as cognitive misers, we prefer fewer options, shorter sentences and paragraphs, etc. The psychological research on this is uncontroversial: humans can only parse seven plus or minus two things at a time—words in a sentence, sentences in a paragraph, menu items, etc. When you have a megamenu with dozens or even hundreds of items, you overwhelm users with options. Their only recourse is to use search. Hence my tweet response above. The only refuge of companies with such large sites is to build internal search engines that automatically find content users are looking for. This view is controversial in our field because of two prevalent myths: Users hate search. I have attended several conference talks where the speaker says, “if you force your users to search, you failed them.” I honestly don’t know where this myth came from, but it is wrong. If users hate search so much, why do they love Google so much? The answer is: users hate crummy search experiences, but they love good ones. Search is just about keywords. Sure, keywords are important, but not as exact-match entities. Keywords are important because of their meaning. If you try to build a search engine that just matches query strings to the frequency, density, or prominence of keywords in content, you will fail. This is essentially what Lucene does, and it is woefully inadequate—by itself—for large sites. What you need is semantic search, which doesn’t just match queries to keyword strings; it matches content to the searcher’s intent implicit in the query. Semantic search engines use natural language processing and machine learning to find the content most relevant to a searcher’s intent. At least that is the goal. 
If you had such a system on your site, users would use search most of the time. Most sites struggle to offer anything close to the ideal. On average, even for large sites, about 10% of visits involve search. The rest of your users expend huge cognitive effort digging through megamenus to find content that might answer their questions or solve their problems. Or they go back to Google, and maybe they click on your search results if you happen to rank against your competitors. If their initial experience with your search and navigation is poor, they prefer to try your competitors’ sites first. For these reasons, building a semantic search experience should be job number one for managers of large sites. But how do you get started? Start by looking at semantic search engines. There are several commercial ones available. But merely installing a semantic search engine isn’t enough. It’s not a magic bullet. All AI systems must be trained. Chances are, your content writers don’t use conventional meanings of common words in your industry. More likely, different units within your company use the same terms differently. Add this complexity to your changing brand names and product portfolios, and the challenge is increasingly about training your search engine on your content. Machine-learning in site search involves developing a feedback loop that elevates results that users click and engage with and pushes results down that have low clicks and high bounce rates. For your most common queries, the feedback loop is the best approach because you likely have multiple pages that could satisfy the user intent of the query. But for uncommon queries, or so-called “long tail” queries, you might only have one good page that answers the question implicit in the query. For these, you need a system that tags the pages to help the search engine recognize their relevance to longer queries. 
Even though these queries might only occur once a month, there are so many of them, they typically constitute the majority of your user queries. Cognitive content markup While you need to train the search engine on your content—especially for common queries—you also need to train your content to help out your search engine, especially for uncommon queries. Cognitive search doesn’t eliminate the need to tag your content, it just makes it less urgent to do so. While you are marking up all your long-tail content, at least you can serve relevant content to your users for their common queries. For everything else, there is no substitute for good tagging. But how do you tag content well considering how error prone human taggers and taxonomists are? The same AI systems that help you serve relevant content to your users can help your taxonomists build better controlled vocabularies. They can also help content practitioners mark up your content more accurately. At IBM, we use Watson Explorer for both, but there are other cognitive technologies that can analyze massive quantities of content and extract the most useful taxonomic values from them. You can try the AlchemyAPI, for example, to extract the most common values on a large site. Then you can use classifiers to build taxonomies from the term extraction. Finally, you can test and iterate on the values you use to make finer adjustments, using human experts to validate relevance. If all this talk of classification makes your head swim, don’t worry. You’re not alone. But it is not as difficult as it seems. Anyone who has studied biology knows that a big part of that field is classifying animals and plants into genus and species, etc. The process is very similar for content classification. You have buyers and buyer journey stages and discrete actions in each stage. At each stage in the journey, different types of content work better because they suit the purpose. 
If you model your content classification according to your buyer journeys, you’ve made a good start in helping search and navigation systems serve the right content to your target audiences, aligned to their user intent. Like this post? Sign up for our emails here. The post Cognitive content strategy appeared first on Biznology." | |
}, | |
{ | |
"id" : "1481673600-webhooks-announcements-asp-net-20mvc-asp-net-20web-20api-asp-net-20webhooks-2016-12-14-announcing-microsoft-aspnet-webhooks-v1", | |
"site" : "MSDN blog", | |
"title": "Announcing Microsoft ASPNET WebHooks V1", | |
"url": "http://localhost:4000/webhooks/announcements/asp.net%20mvc/asp.net%20web%20api/asp.net%20webhooks/2016/12/14/announcing-microsoft-aspnet-webhooks-v1.html", | |
"categories" : ["WebHooks","Announcements","ASP.NET MVC","ASP.NET Web Api","ASP.NET WebHooks"], | |
"tags" : ["WebHooks","Announcements","ASP.NET MVC","ASP.NET Web Api","ASP.NET WebHooks"], | |
"authors" : ["Henrik F Nielsen"], | |
"publishedDate" : "2016-12-14 00:00:00 +0000", | |
"content" : "We are very happy to announce ASP.NET WebHooks V1 making it easy to both send and receive WebHooks with ASP.NET. WebHooks provide a simple pub/sub model for wiring together Web APIs and services with your code. A WebHook can be used to get notified when a file has changed in Dropbox, a code change has been committed to GitHub, a payment has been initiated in PayPal, a card has been created in Trello, and much more — the possibilities are endless! When subscribing, you provide a callback URI where you want to be notified. When an event occurs, an HTTP POST request is sent to your callback URI with information about what happened so that your Web app can act accordingly. WebHooks happen without polling and with no need to hold open a network connection while waiting for notifications. Because of their simplicity, WebHooks are already exposed by most popular services and Web APIs. To help managing WebHooks, Microsoft ASP.NET WebHooks makes it easier to both send and receive WebHooks as part of your ASP.NET application: On the receiving side, it provides a common model for receiving and processing WebHooks from any number of WebHook providers. It comes out of the box with support for Azure Alerts, BitBucket, Dropbox, Dynamics CRM, GitHub, Kudu, Instagram, MailChimp, MyGet, PayPal, Pusher, Salesforce, Slack, Stripe, Trello, Visual Studio Team Services, WordPress, and Zendesk as well as IFTTT and Zapier, but it is easy to add support for more. On the sending side, it provides support for generating WebHooks as a result of changes in your service. It helps managing and storing subscriptions as well as sending event notifications to the right set of subscribers. This allows you to define your own set of events that users can subscribe to. ASP.NET WebHooks provides a lot of flexibility for sending and persisting WebHooks, scaling your solution up and out, as well as sending WebHooks from WebJobs and other places in addition to your Web Application. 
The two parts can be used together or apart depending on your scenario. If you only need to receive WebHooks from other services, then you can use just the receiver part; if you only want to expose WebHooks for others to consume, then you can do just that. In addition to hosting your own WebHook server, ASP.NET WebHooks are part of Azure Functions where you can process WebHooks without hosting or managing your own server! You can even go further and host an Azure Bot Service using Microsoft Bot Framework for writing cool bots talking to your customers! The WebHook code targets ASP.NET Web API 2 and ASP.NET MVC 5, is available as open source on GitHub, and as NuGet packages. A port to ASP.NET Core is planned, so please stay tuned! Receiving WebHooks Dealing with WebHooks depends on who the sender is. Sometimes there are additional steps when registering a WebHook, such as verifying that the subscriber is really listening. Often the security model varies quite a bit. Some WebHooks provide a push-to-pull model where the HTTP POST request only contains a reference to the event information which is then to be retrieved independently. The purpose of Microsoft ASP.NET WebHooks is to make it both simpler and more consistent to wire up your API without spending a lot of time figuring out how to handle any WebHook variant: A WebHook handler is where you process the incoming WebHook. Here is a sample handler illustrating the basic model. 
No registration is necessary – it will automatically get picked up and called: public class MyHandler : WebHookHandler { // The ExecuteAsync method is where to process the WebHook data regardless of receiver public override Task ExecuteAsync(string receiver, WebHookHandlerContext context) { // Get the event type string action = context.Actions.First(); // Extract the WebHook data as JSON or any other type as you wish JObject data = context.GetDataOrDefault<JObject>(); return Task.FromResult(true); } } Finally, we want to ensure that we only receive HTTP requests from the intended party. Most WebHook providers use a shared secret which is created as part of subscribing for events. The receiver uses this shared secret to validate that the request comes from the intended party. It can be provided by setting an application setting in the Web.config file, or better yet, configured through the Azure portal or even retrieved from Azure Key Vault. For more information about receiving WebHooks and [lots of samples](https://github.com/aspnet/WebHooks/tree/master/samples), please see these resources: * [Sending and Receiving WebHooks triggered by workflows and custom workflow activities from Microsoft Dynamics CRM](http://blogs.msdn.com/b/crm/archive/2016/01/15/sending-webhooks-with-microsoft-dynamics-crm.aspx). * [Subscribing to Instagram listening for media posted within a given geo-location](http://blogs.msdn.com/b/webdev/archive/2015/09/21/integrating-with-instagram-using-asp-net-webhooks-preview.aspx) and [associated sample](https://github.com/aspnet/WebHooks/tree/master/samples/InstagramReceiver). * [Subscribing to new and updated leads and opportunities from Salesforce](http://blogs.msdn.com/b/webdev/archive/2015/09/07/integrating-with-salesforce-using-asp-net-webhooks-preview.aspx). 
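For orientation, wiring up a receiver in a Web API project is roughly one line per provider, plus a shared-secret app setting. A minimal sketch, assuming the GitHub receiver package (Microsoft.AspNet.WebHooks.Receivers.GitHub) is installed, not the article’s own sample:

```csharp
// WebApiConfig.cs -- sketch of receiver wiring, assuming the GitHub receiver package
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.MapHttpAttributeRoutes();

        // Wires up the GitHub WebHook receiver. The shared secret is read from
        // the MS_WebHookReceiverSecret_GitHub application setting in Web.config,
        // which the receiver uses to validate each incoming request.
        config.InitializeReceiveGitHubWebHooks();
    }
}
```

With that in place, GitHub can be pointed at https://&lt;host&gt;/api/webhooks/incoming/github, and a handler like MyHandler above is invoked for each validated delivery.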
* [Subscribing to Slack WebHooks](http://blogs.msdn.com/b/webdev/archive/2015/09/06/receiving-slack-webhooks-with-asp-net-webhooks.aspx) and [Using Slack Slash Commands](https://blogs.msdn.microsoft.com/webdev/2016/02/14/asp-net-webhooks-and-slack-slash-commands/) enabling rich commands with structured data, images, and more; see [associated sample](https://github.com/aspnet/WebHooks/tree/master/samples/SlackReceiver). * [Integrating with IFTTT and Zapier to Monitor Twitter and Google Sheets](http://blogs.msdn.com/b/webdev/archive/2015/11/21/using-asp-net-webhooks-with-ifttt-and-zapier-to-monitor-twitter-and-google-sheets.aspx) and [associated sample](https://github.com/aspnet/WebHooks/tree/master/samples/GenericReceivers). * [Receiving WebHooks from Azure Alerts and Kudu (Azure Web App Deployment)](http://blogs.msdn.com/b/webdev/archive/2015/10/04/receive-webhooks-from-azure-alerts-and-kudu-azure-web-app-deployment.aspx) and [associated sample](https://github.com/aspnet/WebHooks/tree/master/samples/AzureReceivers). * [Sample building a Bitbucket WebHooks receiver](https://github.com/aspnet/WebHooks/tree/master/samples/BitbucketReceiver). * [Sample building a Stripe WebHooks receiver](https://github.com/aspnet/WebHooks/tree/master/samples/StripeReceiver). ### Sending WebHooks Sending WebHooks is slightly more involved in that there are more things to keep track of. To support other APIs registering for WebHooks from your ASP.NET application, we need to provide support for: * Exposing which events subscribers can subscribe to, for example _Item Created_ and _Item Deleted_; * Managing subscribers and their registered WebHooks which includes persisting them so that they don’t disappear; * Handling per-user events in the system and determine which WebHooks should get fired so that WebHooks go to the correct receivers. 
For example, if user _A_ caused an _Item Created_ event to fire then determine which WebHooks registered by user _A_ should be sent. We don’t want events for user _A_ to be sent to user _B_. * Sending WebHooks to receivers with matching WebHook registrations. As described in the blog [Sending WebHooks with ASP.NET WebHooks Preview](http://blogs.msdn.com/b/webdev/archive/2015/09/15/sending-webhooks-with-asp-net-webhooks-preview.aspx), the basic model for sending WebHooks works as illustrated in this diagram: [](https://msdnshared.blob.core.windows.net/media/MSDNBlogsFS/prod.evol.blogs.msdn.com/CommunityServer.Blogs.Components.WeblogFiles/00/00/00/63/56/metablogapi/2678.WebHooksSender_049B8B41.png) Here we have a regular Web site (for example deployed in Azure) with support for registering WebHooks. WebHooks are typically triggered as a result of incoming HTTP requests through an MVC controller or a WebAPI controller. The orange blocks are the core abstractions provided by ASP.NET WebHooks: 1. [IWebHookStore](https://github.com/aspnet/WebHooks/blob/master/src/Microsoft.AspNet.WebHooks.Custom/WebHooks/IWebHookStore.cs): An abstraction for storing WebHook registrations persistently. Out of the box we provide support for [Azure Table Storage](http://blogs.msdn.com/b/webdev/archive/2015/09/15/sending-webhooks-with-asp-net-webhooks-preview.aspx) and [SQL](http://blogs.msdn.com/b/webdev/archive/2015/11/07/updates-to-microsoft-asp-net-webhooks-preview.aspx) but the list is open-ended. 2. [IWebHookManager](https://github.com/aspnet/WebHooks/blob/master/src/Microsoft.AspNet.WebHooks.Custom/WebHooks/IWebHookManager.cs): An abstraction for determining which WebHooks should be sent as a result of an event notification being generated. The manager can match event notifications with registered WebHooks as well as applying filters. 3. 
[IWebHookSender](https://github.com/aspnet/WebHooks/blob/master/src/Microsoft.AspNet.WebHooks.Custom/WebHooks/IWebHookSender.cs): An abstraction for sending WebHooks determining the retry policy and error handling as well as the actual shape of the WebHook HTTP requests. Out of the box we provide support for immediate transmission of WebHooks as well as a queuing model which can be used for scaling up and out, see the blog [New Year Updates to ASP.NET WebHooks Preview](http://blogs.msdn.com/b/webdev/archive/2015/12/31/new-year-updates-to-asp-net-webhooks-preview.aspx) for details. The registration process can happen through any number of mechanisms as well. Out of the box we support registering WebHooks through a REST API but you can also build registration support as an MVC controller or anything else you like. It’s also possible to generate WebHooks from inside a [WebJob](http://www.hanselman.com/blog/IntroducingWindowsAzureWebJobs.aspx). This enables you to send WebHooks not just as a result of incoming HTTP requests but also as a result of messages being sent on a queue, a blob being created, or anything else that can trigger a WebJob: [](https://msdnshared.blob.core.windows.net/media/MSDNBlogsFS/prod.evol.blogs.msdn.com/CommunityServer.Blogs.Components.WeblogFiles/00/00/00/63/56/metablogapi/8080.WebHooksWebJobsSender_033BC82A.png) The following resources provide details about building support for sending WebHooks [as well as samples](https://github.com/aspnet/WebHooks/tree/master/samples): * [Sending WebHooks with ASP.NET WebHooks Preview](http://blogs.msdn.com/b/webdev/archive/2015/09/15/sending-webhooks-with-asp-net-webhooks-preview.aspx) describes the basic model for handling WebHook subscriptions, generating event notifications, and for receiving such WebHooks. 
Also check out the associated [sample of a basic Web Application sending custom WebHooks](https://github.com/aspnet/WebHooks/tree/master/samples/CustomSender) and a [sample receiving custom WebHooks](https://github.com/aspnet/WebHooks/tree/master/samples/CustomReceiver). * The blog [New Year Updates to ASP.NET WebHooks Preview Dec 2015](http://blogs.msdn.com/b/webdev/archive/2015/12/31/new-year-updates-to-asp-net-webhooks-preview.aspx) goes into detail on how to send events to _all_ users and how to scale up and out your solution using persistent queues. There is also a [sample scaling out WebHooks by sending them to a queue](https://github.com/aspnet/WebHooks/tree/master/samples/CustomSender.QueuedSender). * The blog [Sending ASP.NET WebHooks from Azure WebJobs](http://blogs.msdn.com/b/webdev/archive/2016/01/31/sending-asp-net-webhooks-from-azure-webjobs.aspx) describes how to send WebHooks from WebJobs, which enable you to generate WebHooks triggered by [a number of sources](https://azure.microsoft.com/en-us/documentation/articles/websites-webjobs-resources/) including queues, blobs, etc. See also the associated [sample sending WebHooks from a WebJob](https://github.com/aspnet/WebHooks/tree/master/samples/CustomSender.WebJob). * The blog [Updates to Microsoft ASP.NET WebHooks Preview Nov 2015](http://blogs.msdn.com/b/webdev/archive/2015/11/07/updates-to-microsoft-asp-net-webhooks-preview.aspx) describes how to store WebHook registrations in SQL and how to register WebHook modules with a dependency engine. Thanks for all the feedback and comments throughout the development process; it is very much appreciated! Have fun! Henrik"
},
{
"id" : "1481500800-uncategorized-2016-12-12-new-updates-to-web-tools-in-visual-studio-2017-rc",
"site" : "MSDN blog",
"title": "New Updates to Web Tools in Visual Studio 2017 RC",
"url": "http://localhost:4000/uncategorized/2016/12/12/new-updates-to-web-tools-in-visual-studio-2017-rc.html",
"categories" : ["Uncategorized"],
"tags" : ["Uncategorized"],
"authors" : ["Daniel Roth"],
"publishedDate" : "2016-12-12 00:00:00 +0000",
"content" : "Update 12/15: There was a bug in the Visual Studio 2017 installer that shipped between 12/12 and 12/14: if you updated a prior RC installation, it uninstalled IIS Express, Web Deploy, and LocalDB. The fix is to manually re-install IIS Express, Web Deploy, and LocalDB. We shipped an updated installer on 12/15 that fixed this issue, so if you updated Visual Studio 2017 RC on 12/15 or later you will not be affected. For details see our known issues page. Today we announced an update to Visual Studio 2017 RC that includes a variety of improvements for both ASP.NET and ASP.NET Core projects. If you’ve already installed Visual Studio 2017 RC then these updates will be pushed to you automatically. Otherwise, simply install Visual Studio 2017 RC and you will get the latest updates. Below is a summary of the improvements to the Web tools in this release: The ability to turn off script debugging for Chrome and Internet Explorer if you prefer to use the in-browser tools. To do this, go to Debug -> Options, and uncheck “Enable JavaScript debugging for ASP.NET (Chrome and IE)”. Bower packages now restore correctly without any manual workarounds required. General stability improvements for ASP.NET Core applications, including: Usability and stability improvements for creating ASP.NET Core apps with Docker containers. Most notably, when provisioning an app in Azure App Service, new resource groups no longer need to be created in the same region as the App Service plan. Entity Framework Core commands such as Add-Migration and Update-Database can be invoked from the NuGet Package Manager Console. ASP.NET Core applications now work with Windows Authentication. Lots of improvements to the .NET Core tooling. For complete details see the .NET team blog post. Thanks for trying out this latest update of Visual Studio 2017!
For an up-to-date list of known issues, see our GitHub page, and keep the feedback coming by reporting any issues using the built-in feedback tools."
},
{
"id" : "1481241600-appetizers-20-20party-20snacks-photo-quick-snack-2016-12-09-union-square-cafe-s-bar-nuts",
"site" : "smitten kitchen Blog",
"title": "union square cafe’s bar nuts",
"url": "http://localhost:4000/appetizers%20+%20party%20snacks/photo/quick/snack/2016/12/09/union-square-cafe-s-bar-nuts.html",
"categories" : ["Appetizers + Party Snacks","Photo","Quick","Snack"],
"tags" : ["Appetizers + Party Snacks","Photo","Quick","Snack"],
"authors" : ["deb"],
"publishedDate" : "2016-12-09 00:00:00 +0000",
"content" : "Four years ago, when I was home for a couple days between book tour stops and I had about 3 gazillion errands to run but I was also hungry (because proper meals are the first thing to go when I’m busy) and really craving a great salad (because vegetables are the first thing to get stiffed when you travel a lot) and I didn’t want to eat it out of a takeout container or on my lap or in a hurry, I wanted to sit down and eat it off a plate like a civilized person with water in a glass, not a plastic bottle, and the want for this was overwhelming and I looked up and I was right in front of the Union Square Cafe and thought, “Why not?” Read more »"
},
{
"id" : "1480982400-uncategorized-communitystandup-2016-12-06-notes-from-the-aspnet-community-standup-november-29-2016",
"site" : "MSDN blog",
"title": "Notes from the ASPNET Community Standup – November 29 2016",
"url": "http://localhost:4000/uncategorized/communitystandup/2016/12/06/notes-from-the-aspnet-community-standup-november-29-2016.html",
"categories" : ["Uncategorized","CommunityStandup"],
"tags" : ["Uncategorized","CommunityStandup"],
"authors" : ["Maria Naggaga"],
"publishedDate" : "2016-12-06 00:00:00 +0000",
"content" : "This is the next in a series of blog posts that will cover the topics discussed in the ASP.NET Community Standup. The community standup is a short video-based discussion with some of the leaders of the ASP.NET development teams covering the accomplishments of the team on the new ASP.NET Core framework over the previous week. Join Scott Hanselman, Damian Edwards, Jon Galloway (Jon’s in Russia this week), and an occasional guest or two as they discuss new features and ask for feedback on important decisions being made by the ASP.NET development teams. Each week the standup is hosted live on Google Hangouts and the team publishes the recorded video of their discussion to YouTube for later reference. The guys answer your questions LIVE and unfiltered. This is your chance to ask about the why and what of ASP.NET! Join them each Tuesday on live.asp.net where the meeting’s schedule is posted and hosted. ASP.NET Community Standup 11/29/2016 Quick Note: Jon’s in Russia this week, so we don’t have any community links this week. Question and Answers This week Damian and Scott jumped right into questions. Damian had a question on Hanselman’s post “Publishing ASP.NET Core 1.1 applications to Azure using git deploy”. Damian’s Question: “How did you create a project without a global.json? …. In Visual Studio today the project always includes a global.json… did you create it on a Mac?” — Scott: “dotnet new in the command line.” Damian went on to explain the difference between a new application created using the dotnet CLI and one created in Visual Studio. When a .NET Core project is created using the dotnet new templates, it does not come with solution-level files like global.json.
[](https://msdnshared.blob.core.windows.net/media/2016/12/dotnetnew.png) dotnet new template files [](https://msdnshared.blob.core.windows.net/media/2016/12/vs.png) Visual Studio 2015 template with global.json Today global.json is how you set the version of the .NET Core SDK needed for your application. Remember that unless you specify the SDK version, .NET Core will use the latest one on your machine and your app will not work. If you find yourself in a scenario similar to the one mentioned, this is how you fix it: Find out what version of the SDK you have locally. Add global.json to your project and include the appropriate version of the SDK. Check out Hanselman’s post “Publishing ASP.NET Core 1.1 applications to Azure using git deploy” for more information on the above. Question: What are we doing to simplify the Docker versioning numbers? — Now that we have released 1.0 and 1.1, we can make a fair assessment of how well the versioning strategy is working. Based on those experiences we are going to make some adjustments. Question: Why isn’t ASP.NET Core 1.1 backward compatible? I have a lot of 1.0 libraries. — The intent is that minor releases, like going from 1.0 to 1.1 of any package or component, shouldn’t break stuff. However, the support matrix for .NET Core is that you can’t mix current components with LTS components. For example, you can’t use ASP.NET Core hosting 1.0 with MVC for ASP.NET Core 1.1. See you at our next community standup!"
},
{
"id" : "1480896000-uncategorized-2016-12-05-introducing-the-aspnet-async-outputcache-module",
"site" : "MSDN blog",
"title": "Introducing the ASPNet Async OutputCache Module",
"url": "http://localhost:4000/uncategorized/2016/12/05/introducing-the-aspnet-async-outputcache-module.html",
"categories" : ["Uncategorized"],
"tags" : ["Uncategorized"],
"authors" : ["lanlanlee2008"],
"publishedDate" : "2016-12-05 00:00:00 +0000",
"content" : "OutputCacheModule is ASP.NET’s default handler for storing the generated output of pages, controls, and HTTP responses. This content can then be reused when appropriate to improve performance. Prior to the .NET Framework 4.6.2, the OutputCache Module did not support async read/write to the storage. Starting with the .NET Framework 4.6.2 release, we introduced a new OutputCache async provider abstract class named OutputCacheProviderAsync, which defines interfaces for an async OutputCache provider to enable asynchronous access to a shared OutputCache. The Async OutputCache Module that supports those interfaces is released as a NuGet package, which you can install into any 4.6.2+ web application. Benefits of the Async OutputCache Module It’s all about scalability. The cloud makes it really easy to scale out computing resources to serve large spikes in service requests to an application. When you consider the scalability of an OutputCache, you cannot use an in-memory provider because the in-memory provider does not allow you to share data across multiple web servers. You will need to store OutputCache data in another storage medium such as Microsoft Azure SQL Database, NoSQL, or Redis Cache. Currently, the OutputCache interaction with these storage mediums is restricted to run synchronously. With this update, the new async OutputCache module enables you to read and write data from these storage providers asynchronously. Async I/O operations help release threads more quickly than synchronous I/O operations, which allows ASP.NET to handle other requests. If you are interested in more details about programming asynchronously and the use of the async and await keywords, you can read Stephen Cleary’s excellent article on [Async Programming : Introduction to Async/Await on ASP.NET](https://msdn.microsoft.com/en-us/magazine/dn802603.aspx). How to use the Async OutputCache Module Target your application to 4.6.2+.
The OutputCacheProviderAsync interface was introduced in .NET Framework 4.6.2; therefore, you need to target your application to .NET Framework 4.6.2 or above in order to use the Async OutputCache Module. Download the .NET Framework 4.6.2 Developer Pack if you do not have it installed yet and update your application’s web.config targetFrameworks attributes as demonstrated below: Add the Microsoft.AspNet.OutputCache.OutputCacheModuleAsync NuGet package. Use the NuGet package manager to install the Microsoft.AspNet.OutputCache.OutputCacheModuleAsync package. This will add a reference to the Microsoft.AspNet.OutputCache.OutputCacheModuleAsync.dll and add the following configuration into the web.config file. Now your applications will start using the Async OutputCache Module. If no outputcacheprovider is specified in web.config, the module will use a default synchronous in-memory provider, in which case you won’t get the async benefits. We have not yet released an Async OutputCache provider, but plan to in the near future. Let’s take a look at how you can implement an async OutputCache Provider of your own. How to implement an async OutputCache Provider An async OutputCache Provider just needs to implement the OutputCacheProviderAsync interface. More specifically, the async provider should implement the following 8 APIs. [Add(String, Object, DateTime)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider.add(v=vs.110).aspx) Inserts the specified entry into the output cache. (Inherited from [OutputCacheProvider](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider(v=vs.110).aspx).) [AddAsync(String, Object, DateTime)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheproviderasync.addasync(v=vs.110).aspx) Asynchronously inserts the specified entry into the output cache.
[Get(String)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider.get(v=vs.110).aspx) Returns a reference to the specified entry in the output cache. (Inherited from [OutputCacheProvider](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider(v=vs.110).aspx).) [GetAsync(String)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheproviderasync.getasync(v=vs.110).aspx) Asynchronously returns a reference to the specified entry in the output cache. [Remove(String)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider.remove(v=vs.110).aspx) Removes the specified entry from the output cache. (Inherited from [OutputCacheProvider](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider(v=vs.110).aspx).) [RemoveAsync(String)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheproviderasync.removeasync(v=vs.110).aspx) Asynchronously removes the specified entry from the output cache. [Set(String, Object, DateTime)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider.set(v=vs.110).aspx) Inserts the specified entry into the output cache, overwriting the entry if it is already cached. (Inherited from [OutputCacheProvider](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheprovider(v=vs.110).aspx).) [SetAsync(String, Object, DateTime)](https://msdn.microsoft.com/en-us/library/system.web.caching.outputcacheproviderasync.setasync(v=vs.110).aspx) Asynchronously inserts the specified entry into the output cache, overwriting the entry if it is already cached. If you want your provider to support Cache Dependency and callback functionality, you will need to implement the interface ICacheDependencyHandler, which is defined within the Microsoft.AspNet.OutputCache.OutputCacheModuleAsync.dll. You can add this reference by installing the same NuGet package referenced in our web project.
The current version of the Async OutputCache Module does not support Registry Key or SQL dependencies. Depending on the feedback we hear, we may consider adding them in the future. Once you have finished implementing your provider class, you can use it in a web application by adding a reference to your library and adding the following configurations into the web.config file: That should work! If you need some help to get started, here is an example of an in-memory Async OutputCache Provider as a proof of concept. You can see that it has implemented all the APIs needed and is ready to plug in and use. Summary To wrap up the things we have talked about: we have released an async version of the OutputCache Module which allows ASP.NET to take advantage of modern async techniques to help scale your OutputCache. With this new interface, you can now write your own async version of OutputCache providers easily. We encourage you to try this module and extend your current OutputCache provider to any storage medium that supports async interactions. We also encourage you to share the providers you wrote on NuGet.org and let us know about them in the comments area below. Good luck and happy coding! If you have any questions or suggestions, please feel free to reach out to us by leaving your comments here."
},
{
"id" : "1480896000-candy-chocolate-gift-20guides-photo-2016-12-05-chocolate-caramel-crunch-almonds-new-kitchen-favorites",
"site" : "smitten kitchen Blog",
"title": "chocolate caramel crunch almonds new kitchen favorites",
"url": "http://localhost:4000/candy/chocolate/gift%20guides/photo/2016/12/05/chocolate-caramel-crunch-almonds-new-kitchen-favorites.html",
"categories" : ["Candy","Chocolate","Gift Guides","Photo"],
"tags" : ["Candy","Chocolate","Gift Guides","Photo"],
"authors" : ["deb"],
"publishedDate" : "2016-12-05 00:00:00 +0000",
"content" : "Mostly because I have little interest in telling you how to part with your hard-earned money, this isn’t a gift guide. However, ahem, I do purchase a few kitchen-related items each year and thought I’d mention some of the standouts from 2016. [Here’s 2015’s list, all still in heavy rotation.] Most are remarkably basic, either because I had necessities to replace (cough, clumsy) but a lot are simply because I’m incredibly stubborn and it really has taken me this long to buy a second set of measuring cups and spoons, some aprons and a coffee-making apparatus. Not all of these may pack up well in boxes with ribbon — well, except that deliciousness at the end, of course — but I can promise you that they’re getting a lot of mileage in a heavy-use kitchen, and as always, I bought them myself. Read more »"
},
{
"id" : "1480550400-azure-2016-12-01-visual-studio-tools-for-azure-functions",
"site" : "MSDN blog",
"title": "Visual Studio Tools for Azure Functions",
"url": "http://localhost:4000/azure/2016/12/01/visual-studio-tools-for-azure-functions.html",
"categories" : ["Azure"],
"tags" : ["Azure"],
"authors" : ["Andrew B Hall - MSFT"],
"publishedDate" : "2016-12-01 00:00:00 +0000",
"content" : "Update 12-6-16 @5:00 PM: An updated version of the tools is available that fixes the ability to open .NET Core projects with the Azure Functions tools installed. Install the updated version over your old version to fix the issue; there is no need to uninstall the previous copy. Today we are pleased to announce a preview of tools for building Azure Functions for Visual Studio 2015. Azure Functions provide event-based serverless computing that makes it easy to develop and scale your application, paying only for the resources your code consumes during execution. This preview offers the ability to create a function project in Visual Studio, add functions using any supported language, run them locally, and publish them to Azure. Additionally, C# functions support both local and remote debugging. In this post, I’ll walk you through using the tools by creating a C# function, covering some important concepts along the way. Then, once we’ve seen the tools in action, I’ll cover some known limitations we currently have. Also, please take a minute and let us know who you are so we can follow up and see how the tools are working. Getting Started Before we dive in, there are a few things to note: These tools are offered as a preview release and will have some rough spots and limitations. They currently only work with Visual Studio 2015 Update 3 with “Microsoft Web Developer Tools” installed. You must have the Azure 2.9.6 .NET SDK installed. Download and install Visual Studio Tools for Azure Functions. For our sample function, we’ll create a C# function that is triggered when a message is published into a storage Queue, reverses it, and stores both the original and reversed strings in Table storage. To create a function, go to: File -> New Project. Then select the “Cloud” node under the “Visual C#” section and choose the “Azure Functions (Preview)” project type. This will give us an empty function project.
There are a few things to note about the structure of the project: appsettings.json is where we’ll store configuration information such as connection strings. It is recommended that you exclude this file from source control so you don’t check in your developer secrets. host.json enables us to configure the behavior of the Azure Functions host. For the purposes of this blog post, we’ll add an entry that speeds up the queue polling interval from the default of once a minute to once a second by setting the “maxPollingInterval” in the host.json (value is in ms). Next, we’ll add a function to the project by right clicking on the project in Solution Explorer and choosing “Add” and then “New Azure Function”. This will bring up the New Azure Function dialog, which enables us to create a function using any language supported by Azure Functions. For the purposes of this post we’ll create a “QueueTrigger – C#” function, fill in the “Queue name” field, “Storage account connection” (this is the name of the key for the setting we’ll store in “appsettings.json”), and the “Name” of our function. Note: All function types except HTTP triggers require a storage connection or you will receive an error at run time. This will create a new folder in the project with the name of our function with the following key files: function.json: contains the configuration data for the function (including the information we specified as part of creating the new function). project.json (C#): is where we’ll specify any NuGet dependencies our function may have. Note: Azure functions automatically import some namespaces and assemblies (e.g. Json.NET).
run.csx: this contains the body of the function that will be executed when triggered. The last thing we need to do in order to hook up our function to our storage Queue is provide the connection string in the appsettings.json file (in this case by setting the value of “AzureWebJobsStorage”). Next we’ll edit the “function.json” file to add two bindings: one that gives us the ability to read from the table we’ll be pushing to, and another that gives us the ability to write entries to the table. Finally, we’ll write our function logic in the run.csx file. Running the function locally works like any other project in Visual Studio: Ctrl + F5 starts it without debugging, and F5 (or the Start/Play button on the toolbar) launches it with debugging. Note: Debugging currently only works for C# functions. Let’s hit F5 to debug the function. The first time we run the function, we’ll be prompted to install the Azure Functions CLI (command line) tools. Click “Yes” and wait for them to install; our function app is now running locally. We’ll see a command prompt with some messages from the Azure Functions CLI pop up; if there were any compilation problems, this is where the messages would appear, since functions are dynamically compiled by the CLI tools at runtime. We now need to manually trigger our function by pushing a message into the queue with Azure Storage Explorer. This will cause the function to execute and hit our breakpoint in Visual Studio. Publishing to Azure Now that we’ve tested the function locally, we’re ready to publish our function to Azure. To do this, right click on the project and choose “Publish…”, then choose “Microsoft Azure App Service” as the publish target. Next, you can either pick an existing app, or create a new one. We’ll create a new one by clicking the “New…” button on the right side of the dialog. This will pop up the provisioning dialog that lets us choose or set up the Azure environment (we can customize the names or choose existing assets).
These are: Function App Name: the name of the function app; this must be unique. Subscription: the Azure subscription to use. Resource Group: which resource group to add the Function App to. App Service Plan: which App Service plan you want to run the function on. For complete information read about hosting plans, but it’s important to note that if you choose an existing App Service plan you will need to set the plan to “always on” or your functions won’t always trigger (Visual Studio automatically sets this if you create the plan from Visual Studio). Now we’re ready to provision (create) all of the assets in Azure. Note: the “Validate Connection” button does not work in this preview for Azure Functions. Once provisioning is complete, click “Publish” to publish the Function to Azure. We now have a publish profile, which means all future publishes will skip the provisioning steps. Note: If you publish to a Consumption plan, there is currently a bug where new triggers that you define (other than HTTP) will not be registered in Azure, which can cause your functions not to trigger correctly. To work around this, open your Function App in the Azure portal and click the “Refresh” button on the lower left to fix the trigger registration. This bug with publish will be fixed on the Azure side soon. To verify our function is working correctly in Azure, we’ll click the “Logs” button on the function’s page, and then push a message into the Queue using Storage Explorer again. We should see a message that the function successfully processed the message. The last thing to note is that it is possible to remote debug a C# function running in Azure from Visual Studio. To do this: open Cloud Explorer, browse to the Function App, right click, and choose “Attach Debugger”. Known Limitations As previously mentioned, this is the first preview of these tools, and we have several known limitations with them.
They are as follows: IntelliSense: IntelliSense support is limited, and available only for C# and JavaScript by default. F#, Python, and PowerShell support is available if you have installed those optional components. It is also important to note that C# and F# IntelliSense is limited at this point to classes and methods defined in the same .csx/.fsx file and a few system namespaces. Cannot add new files using “Add New Item”: Adding new files to your function (e.g. .csx or .json files) is not available through “Add New Item”. The workaround is to add them using File Explorer, the Add New File extension, or another tool such as Visual Studio Code. Functions published from Visual Studio are not properly registered in Azure: This is caused by a bug in the Azure service for Functions running on a Consumption plan. The workaround is to open the Function App’s page in the Azure portal and click the “Refresh” button in the bottom left. This will register the functions with Azure. Function bindings generate incorrectly when creating a C# Image Resize function: The settings for the binding “Azure Storage Blob out (imageSmall)” are overridden by the settings for the binding “Azure Storage Blob out (imageMedium)” in the generated function.json. The workaround is to go to the generated function.json and manually edit the “imageSmall” binding. Cannot use project names with a “.” character: The Azure Functions CLI version 1.0.0-beta.8 will not work if launched from a folder with a “.” character (tracked by this GitHub issue). The workaround is to use spaces or dashes until this bug is fixed. Local deployment and web deploy packages are not supported: Currently, only Web Deploy to App Service is supported. If you try to use Local Deploy or a Web Deploy Package, you’ll see the error “GatherAllFilesToPublish does not exist in the project”.
The Publish Preview shows all files in the project’s folder even if they are not part of the project: Publish preview does not function correctly, and will cause all files in the project folder to be picked up and published. Avoid using the Preview view. Conclusion Please download and try out this preview of Visual Studio Tools for Azure Functions and let us know who you are so we can follow up and see how they are working. Additionally, please report any issues you encounter on our GitHub repo (include “Visual Studio” in the issue title) and provide any comments or questions you have below, or via Twitter."
}, | |
{ | |
"id" : "1480377600-uncategorized-communitystandup-2016-11-29-notes-from-the-aspnet-community-standup-november-22-2016", | |
"site" : "MSDN blog", | |
"title": "Notes from the ASPNET Community Standup – November 22 2016", | |
"url": "http://localhost:4000/uncategorized/communitystandup/2016/11/29/notes-from-the-aspnet-community-standup-november-22-2016.html", | |
"categories" : ["Uncategorized","CommunityStandup"], | |
"tags" : ["Uncategorized","CommunityStandup"], | |
"authors" : ["Maria Naggaga"], | |
"publishedDate" : "2016-11-29 00:00:00 +0000", | |
"content" : "This is the next in a series of blog posts that will cover the topics discussed in the ASP.NET Community Standup. The community standup is a short video-based discussion with some of the leaders of the ASP.NET development teams covering the accomplishments of the team on the new ASP.NET Core framework over the previous week. Join Scott Hanselman, Damian Edwards, Jon Galloway and an occasional guest or two as they discuss new features and ask for feedback on important decisions being made by the ASP.NET development teams. This week the team hosted the standup on Aerial Spaces. Every week’s episode is published on YouTube for later reference. The team answers your questions LIVE and unfiltered. This is your chance to ask about the why and what of ASP.NET! Join them each Tuesday on live.asp.net where the meeting’s schedule is posted and hosted. ASP.NET Community Standup 11/22/2016 Community Links Announcing the Fastest ASP.NET Yet, ASP.NET Core 1.1 RTM Announcing .NET Core 1.1 App Service on Linux now supports Containers and ASP.NET Core ASP.NET Core Framework Benchmarks Round 13 MVP Hackathon 2016: Cool Projects from Microsoft MVPs Damian Edwards live coding live.asp.net EDI.Net Serializer/Deserializer ASP.NET Core’s URL Rewrite Middleware behind a load balancer ASP.NET Core Workshops and Code Labs Unexpected Behavior in LanguageViewLocationExpander Project.json to CSproj OrchardCMS Roadmap ASP.NET Core and the Enterprise Part 3: Middleware Using .NET Core Configuration with legacy projects High-Performance Data Pipelines .NET Core versioning Not your granddad’s .NET – Pipes Part 1 Accomplishments Tech Empower Benchmark Tech Empower Benchmark Round 13 came out and ASP.NET Core is in the top 10, receiving 1,822,366 requests per second in Round 13. Read more Question and Answers Question: Will there be MVC 4 project support in Visual Studio 2017? — Removed in RC but should be coming back in the next release. 
Question: What should I grab: the ASP.NET Core 1.1 runtime or the SDK? / What’s the difference between the .NET Core SDK and runtime? — In short, if you are a developer you want to install the .NET Core SDK. If you are a server administrator you may only want to install the runtime. Question: Will .csproj tooling be finalized with Visual Studio 2017 RTM? — Yes, that is the current plan in place. There are a couple of known issues for ASP.NET Core support in Visual Studio 2017; we have listed the workarounds on our GitHub repo. Question: How far along is the basic pipeline API? — Currently, it is being tested by some folks at Stack Overflow. If you would like to get involved, tweet David Fowler. Question: When will URL-based culture localization be available? — It’s available now, with ASP.NET Core 1.1 middleware as MVC filters. In this example from the ASP.NET Core 1.1 announcement we used a route-value-based request culture provider to establish the current culture for the request using the localization middleware. The team will be back on Tuesday the 29th of November to discuss the latest updates on ASP.NET Core. See you then!"
}, | |
{ | |
"id" : "1480291200-breakfast-casserole-lunch-photo-picnics-spinach-tarts-quiche-vegetarian-weeknight-20favorite-2016-11-28-spinach-sheet-pan-quiche", | |
"site" : "smitten kitchen Blog", | |
"title": "spinach sheet pan quiche", | |
"url": "http://localhost:4000/breakfast/casserole/lunch/photo/picnics/spinach/tarts/quiche/vegetarian/weeknight%20favorite/2016/11/28/spinach-sheet-pan-quiche.html", | |
"categories" : ["Breakfast","Casserole","Lunch","Photo","Picnics","Spinach","Tarts/Quiche","Vegetarian","Weeknight Favorite"], | |
"tags" : ["Breakfast","Casserole","Lunch","Photo","Picnics","Spinach","Tarts/Quiche","Vegetarian","Weeknight Favorite"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-28 00:00:00 +0000", | |
"content" : "I know we all associate December with cookies, cocktails, yule logs and latkes, but what about the smaller, enduring festivities that often go overlooked, namely workplace and other potluck luncheons? Because my “coworkers” are basically a laptop and occasionally these wild things, my current participation level is limited, but I know that usually what happens is that it’s rather easy to bring cookies and cakes but as nobody wants to drag a roast on the subway and then heat it up in the breakroom microwave, main dishes are harder to nail down. Read more »"
}, | |
{ | |
"id" : "1479772800-uncategorized-communitystandup-2016-11-22-notes-from-the-aspnet-community-standup-november-1-2016", | |
"site" : "MSDN blog", | |
"title": "Notes from the ASPNET Community Standup – November 1 2016", | |
"url": "http://localhost:4000/uncategorized/communitystandup/2016/11/22/notes-from-the-aspnet-community-standup-november-1-2016.html", | |
"categories" : ["Uncategorized","CommunityStandup"], | |
"tags" : ["Uncategorized","CommunityStandup"], | |
"authors" : ["Maria Naggaga"], | |
"publishedDate" : "2016-11-22 00:00:00 +0000", | |
"content" : "This is the next in a series of blog posts that will cover the topics discussed in the ASP.NET Community Standup. The community standup is a short video-based discussion with some of the leaders of the ASP.NET development teams covering the accomplishments of the team on the new ASP.NET Core framework over the previous week. Join Scott Hanselman, Damian Edwards, Jon Galloway and an occasional guest or two as they discuss new features and ask for feedback on important decisions being made by the ASP.NET development teams. Each week the standup is hosted live on Google Hangouts and the team publishes the recorded video of their discussion to YouTube for later reference. The guys answer your questions LIVE and unfiltered. This is your chance to ask about the why and what of ASP.NET! Join them each Tuesday on live.asp.net where the meeting’s schedule is posted and hosted. ASP.NET Community Standup 11/01/2016 Community Links Puma Scan is a software security Visual Studio analyzer extension that is built on top of Roslyn. Plug ASP.NET Core Middleware in MVC Filters Pipeline Building An API with NancyFX 2.0 + Dapper .NET Standard based Windows Service support for .NET Accessing the HTTP Context on ASP.NET Core Accessing services when configuring MvcOptions in ASP.NET Core Adding Cache-Control headers to Static Files in ASP.NET Core Building .Net Core On Travis CI Umbraco CLI running on ASP.NET Core Testing SSL in ASP.NET Core ASP.NET API Versioning Creating a new .NET Core web application, what are your options? 
Using MongoDB .NET Driver with .NET Core WebAPI ASP.NET Core project targeting .NET 4.5.1 running on Raspberry Pi Free ASP.NET Core 1.0 Training on Microsoft Virtual Academy Using dotnet watch test for continuous testing with .NET Core and XUnit.net Azure Log Analytics ASP.NET Core Logging extension Bearer Token Authentication in ASP.NET Core ASP.NET Core Module Removal of dnvm scripts for the aspnet/home repo Demos ASP.NET Core 1.1 Preview 1 added a couple of new features around Azure integration, performance and more. In this Community Standup Damian walks us through how he easily upgraded the live.asp.net site to ASP.NET Core 1.1, as well as how to add view compilation and Azure App Services. Upgrading Existing Projects Before you start using any of the ASP.NET Core 1.1 Preview 1 features, make sure to update the following: Install the .NET Core 1.1 Preview 1 SDK. Upgrade your existing project from .NET Core 1.0 to .NET Core 1.1 Preview 1. Make sure to also update your ASP.NET Core packages to their latest versions, 1.1.0-preview1. Update the netcoreapp1.0 target framework to netcoreapp1.1. View compilation Damian went over how he added view compilation to live.asp.net. Typically your Razor views get compiled the first time someone visits the site. The advantage of view compilation is that you can now precompile the Razor views that your application references and deploy them. This feature allows for faster startup times in your application since your views are ready to go. To start using precompiled views in your application, follow these steps. Add the view compilation package \"Microsoft.AspNetCore.Mvc.Razor.ViewCompilation.Design\": { \"version\": \"1.1.0-preview4-final\", \"type\": \"build\" } Add the view compilation tool \"Microsoft.AspNetCore.Mvc.Razor.ViewCompilation.Tools\": { \"version\": \"1.1.0-preview4-final\" } Include the post-publish script to invoke pre-compilation Now that live.asp.net is configured to use view compilation, it will pre-compile the Razor views. 
Once you’ve published your application, you will notice that your PublishOutput folder no longer contains a Views folder. Instead, you will see appname.PrecompileViews.dll. Azure App Service logging provider Damian also configured live.asp.net to use Azure App Services. By adding the Microsoft.AspNetCore.AzureAppServicesIntegration package and calling the UseAzureAppServices method in Program.cs, diagnostic logs are now turned on in Azure (see image below). With Application Logging turned on, you can choose the log level you want and see the logs in the Kudu console or in Visual Studio (see image below). [](https://msdnshared.blob.core.windows.net/media/2016/11/AppServices-ViewLoggingInKuduConsole-150.png) _Application Logs in Kudu_ This week Damian went over how to use some of the new features in ASP.NET Core 1.1 Preview 1. For more details on ASP.NET Core 1.1 please check out the announcement from last month. Thanks for watching."
}, | |
{ | |
"id" : "1479772800-uncategorized-2016-11-22-mvp-hackathon-2016-cool-projects-from-microsoft-mvps", | |
"site" : "MSDN blog", | |
"title": "MVP Hackathon 2016 Cool Projects from Microsoft MVPs", | |
"url": "http://localhost:4000/uncategorized/2016/11/22/mvp-hackathon-2016-cool-projects-from-microsoft-mvps.html", | |
"categories" : ["Uncategorized"], | |
"tags" : ["Uncategorized"], | |
"authors" : ["Jeffrey T. Fritz"], | |
"publishedDate" : "2016-11-22 00:00:00 +0000", | |
"content" : "Last week was the annual MVP Summit on Microsoft’s Redmond campus. We laughed, we cried, we shared stories around the campfire, and we even made s’mores. Ok, I’m stretching it a bit about the last part, but we had a good time introducing the MVPs to some of the cool technologies you saw at Connect() yesterday, and some that are still in the works for 2017. As part of the MVP Summit event, we hosted a hackathon to explore some of the new features and allow attendees to write code along with Microsoft engineers and publish that content as an open source project. We shared the details of some of these projects with the supervising program managers covering Visual Studio, ASP.NET, and the .NET framework. Those folks were impressed with the work that was accomplished, and now we want to share these accomplishments with you. This is what a quick day’s worth of work can accomplish when working with your friends. [](https://msdnshared.blob.core.windows.net/media/2016/11/HackathonGroup-cropped.jpg) MVP Hackers at the end of the Hackathon Shaun Luttin wrote a console application in F# that plays a card trick. Source code at: https://github.com/shaunluttin/magical-mathematics Rainer Stropek created a docker image to fully automate the deployment and running of a Minecraft server with bindings to allow interactions with the server using .NET Core. Rainer summarized his experience and the docker image on his blog Tanaka Takayoshi wrote an extension command called “add” for the dotnet command-line interface. The Add command helps format new classes properly with namespace and initial class declaration code when you are working outside of Visual Studio. Tanaka’s project is on GitHub. Tomáš Herceg wrote an extension for Visual Studio 2017 that supports development with the DotVVM framework for ASP.NET. DotVVM is a front-end framework that dramatically simplifies the amount of code you need to write in order to create useful web UI experiences. 
His project can be found on GitHub at: https://github.com/riganti/dotvvm See the animated gif below for a sample of how DotVVM can be coded in Visual Studio 2017: [](https://msdnshared.blob.core.windows.net/media/2016/11/dotvvm_intellisense.gif) DotVVM Intellisense in action The ASP.NET Monsters wrote Pugzor, a drop-in replacement for the Razor view engine using the “Pug” JavaScript library as the parser and renderer. It can be added side-by-side with Razor in your project and enabled with one line of code. If you have Pug templates (previously called Jade) these now work as-is inside ASP.NET Core MVC. The ASP.NET Monsters are: Simon Timms, David Paquette and James Chambers [](https://msdnshared.blob.core.windows.net/media/2016/11/pugzor.jpg) Pugzor Alex Sorokoletov wrote an add-in for Xamarin Studio that helps to clean up unused using statements and sort them alphabetically on every save. The project can be found at: https://github.com/alexsorokoletov/XamarinStudio.SortRemoveUsings Remo Jansen put together an extension for Visual Studio Code to display class diagrams for TypeScript. The extension is in alpha, but looks very promising on his GitHub project page. [](https://msdnshared.blob.core.windows.net/media/2016/11/VsCodeTsUmlPreview.gif) Visual Studio Code – TypeScript UML Generator Giancarlo Lelli put together an extension to help deploy front-end customizations for Dynamics 365 directly from Visual Studio. It uses the TFS Client API to detect any changes in your workspace and check in everything on your behalf. It is able to handle conflicts, preventing you from overwriting the work of other colleagues. The extension keeps the same folder structure you have in your solution explorer inside the CRM. It also supports automatically adding new web resources to a specific CRM solution. This extension uses the VS output window to provide feedback during the whole publish process. The project can be found on its GitHub page. 
[](https://msdnshared.blob.core.windows.net/media/2016/11/PublishToDynamics.png) Publish to Dynamics Simone Chiaretta wrote an extension for the dotnet command-line tool to manage the properties in .NET Core projects based on MSBuild. It allows setting and removing the version number, the supported runtimes and the target framework (and more properties are being added soon). It also lists all the properties in the project file. You can extend your .NET CLI with his NuGet package or grab the source code from GitHub. He’s written a blog post with more details as well. [](https://msdnshared.blob.core.windows.net/media/2016/11/dotnet-prop.png) The dotnet prop command Nico Vermeir wrote an amazing little extension that enables the Surface Dial to help run the Visual Studio debugger. He wrote a blog post about it and published his source code on GitHub. David Gardiner wrote a Roslyn Analyzer that provides tips and best practice recommendations when authoring extensions for Visual Studio. Source code is on GitHub. [](https://msdnshared.blob.core.windows.net/media/2016/11/VsixAnalyzers.gif) VSIX Analyzers Cecilia Wirén wrote an extension for Visual Studio that allows you to add a folder on disk as a solution folder, preserving all files in the folder. Cecilia’s code can be found on GitHub. [](https://msdnshared.blob.core.windows.net/media/2016/11/AddAsSolutionFolder.gif) Add Folder as Solution Folder Terje Sandstrom updated the NUnit 3 adapter to support Visual Studio 2017. [](https://msdnshared.blob.core.windows.net/media/2016/11/Nunit.png) NUnit Results in Visual Studio 2017 Ben Adams made the Kestrel web server for ASP.NET Core 8% faster while sitting in with some of the ASP.NET Core folks. Summary We had an amazing time working together, pushing each other to develop and build more cool things that could be used with Visual Studio 2015, 2017, Code, and Xamarin Studio. 
Stepping away from the event and reading about these cool projects inspires me to write more code, and I hope it does the same for you. Would you be interested in participating in a hackathon with MVPs or Microsoft staff? Let us know in the comments below."
}, | |
{ | |
"id" : "1479686400-aspnet-debugging-visual-20studio-202017-2016-11-21-clientside-debugging-of-aspnet-projects-in-google-chrome", | |
"site" : "MSDN blog", | |
"title": "Clientside debugging of ASPNET projects in Google Chrome", | |
"url": "http://localhost:4000/aspnet/debugging/visual%20studio%202017/2016/11/21/clientside-debugging-of-aspnet-projects-in-google-chrome.html", | |
"categories" : ["AspNet","Debugging","Visual Studio 2017"], | |
"tags" : ["AspNet","Debugging","Visual Studio 2017"], | |
"authors" : ["Mads Kristensen"], | |
"publishedDate" : "2016-11-21 00:00:00 +0000", | |
"content" : "Visual Studio 2017 RC now supports client-side debugging of both JavaScript and TypeScript in Google Chrome. For years, it has been possible to debug both the backend .NET code and the client-side JavaScript code running in Internet Explorer at the same time. Unfortunately, the capability was limited solely to Internet Explorer. In Visual Studio 2017 RC that changes. You can now debug both JavaScript and TypeScript directly in Visual Studio when using Google Chrome as your browser of choice. All you need to do is select Chrome as your browser in Visual Studio and hit F5 to debug. If you’re interested in giving us feedback on future features and ideas before we ship them, join our community. The first thing you’ll notice when launching Chrome by hitting F5 in Visual Studio is a page that says, “Please wait while we attach…”. What happens is that Visual Studio is attaching to Chrome using the remote debugging protocol and then redirects to the ASP.NET project URL (something like http://localhost:12345) after it attaches. After the attach is complete, the “Please wait while we attach…” message remains visible while the ASP.NET site starts up, where normally you’d see a blank browser during this time. Once the debugger is attached, script debugging is now enabled for all JavaScript files in the project as well as all TypeScript files if there is source map information available. Here’s a screenshot of a breakpoint being hit in a TypeScript file. For TypeScript debugging you need to instruct the compiler to produce a .map file. You can do that by placing a tsconfig.json file in the root of your project and specifying a few properties, like so: { \"compileOnSave\": true, \"compilerOptions\": { \"sourceMap\": true } } There are developers who prefer to use Chrome’s or IE’s own dev tools to do client-side debugging and that is great. 
There will be a setting in Visual Studio that allows you to disable client-side debugging in both IE and Chrome, but unfortunately that didn’t make it into the release candidate. We hope you’ll enjoy this feature and we would love to hear your feedback in the comments section below, or via Twitter. Download Visual Studio 2017 RC"
}, | |
{ | |
"id" : "1479686400-apple-brussels-20sprouts-fall-gluten-free-onions-photo-pomegranate-salad-side-20dish-thanksgiving-tips-vegetarian-winter-2016-11-21-brussels-sprouts-apple-and-pomegranate-salad", | |
"site" : "smitten kitchen Blog", | |
"title": "brussels sprouts apple and pomegranate salad", | |
"url": "http://localhost:4000/apple/brussels%20sprouts/fall/gluten-free/onions/photo/pomegranate/salad/side%20dish/thanksgiving/tips/vegetarian/winter/2016/11/21/brussels-sprouts-apple-and-pomegranate-salad.html", | |
"categories" : ["Apple","Brussels Sprouts","Fall","Gluten-Free","Onions","Photo","Pomegranate","Salad","Side Dish","Thanksgiving","Tips","Vegetarian","Winter"], | |
"tags" : ["Apple","Brussels Sprouts","Fall","Gluten-Free","Onions","Photo","Pomegranate","Salad","Side Dish","Thanksgiving","Tips","Vegetarian","Winter"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-21 00:00:00 +0000", | |
"content" : "Things I Learned Hosting My First Friendsgiving On logistics • As I realized last week, what makes big meals (we had 16 people) scary isn’t the cooking as much as the sheer volume of it all and the logistics required to manage them. I mean, who here has a kitchen that was built to feed 16? Trust me, it’s not you, it’s your kitchen making things hard. Read more »"
}, | |
{ | |
"id" : "1479254400-aspnetcore-azure-announcement-announcements-app-20service-container-2016-11-16-put-a-net-core-app-in-a-container-with-the-new-docker-tools-for-visual-studio", | |
"site" : "MSDN blog", | |
"title": "Put a NET Core App in a Container with the new Docker Tools for Visual Studio", | |
"url": "http://localhost:4000/aspnetcore/azure/announcement/announcements/app%20service/container/2016/11/16/put-a-net-core-app-in-a-container-with-the-new-docker-tools-for-visual-studio.html", | |
"categories" : ["AspNetCore","Azure","Announcement","Announcements","app service","container"], | |
"tags" : ["AspNetCore","Azure","Announcement","Announcements","app service","container"], | |
"authors" : ["Jeffrey T. Fritz"], | |
"publishedDate" : "2016-11-16 00:00:00 +0000", | |
"content" : "By now hopefully you’ve heard the good news that we’ve added first class support for building and running .NET applications inside of Docker containers in Visual Studio 2017 RC. Visual Studio 2017 and Docker support building and running .NET applications using Windows containers (on Windows 10/Server 2016 only), and .NET Core applications on Linux containers, including the ability to publish and run Linux containers on Microsoft’s Azure App Service. Docker containers package an application with everything it needs to run: code, runtime, system tools, system libraries – anything you would install on a server. Put simply, a container is an isolated place where an application can run without affecting the rest of the system, and without the system affecting the application. This makes them an ideal way to package and run applications in production environments, where historically constraints imposed by the production environment (e.g. which version of the .NET runtime the server is running) have dictated development decisions. Additionally, Docker containers are very lightweight, which enables scaling applications quickly by spinning up new instances. In this post, I’ll focus on creating a .NET Core application, publishing it to the Azure App Service Linux Preview and setting up continuous build integration and delivery to the Azure Container Service. Getting Started To get started in Visual Studio 2017 you need to install the “.NET Core and Docker (Preview)” workload in the new Visual Studio 2017 installer. Once it finishes installing, you’ll need to install Docker for Windows (if you want to use Windows containers on Windows 10 or Server 2016 you’ll need the Beta channel and the Windows 10 Anniversary Edition, if you want Linux containers you can choose either the Stable or Beta channel installers). After you’ve finished installing Docker, you’ll need to share a drive with it where your images will be built to and run from. 
To do this: Right click on the Docker system tray icon and choose settings Choose the “Shared Drives” tab Share the drive your images will run from (this is the same drive the Visual Studio project will live on) Creating an application with Docker support Now that Visual Studio and Docker are installed and configured properly, let’s create a .NET Core application that we’ll run in a Linux container. On the ASP.NET application dialog, there is a checkbox that allows us to add Docker support to the application as part of project creation. For now, we’ll skip this, so we can see how to add Docker support to existing applications. Now that we have our basic web application, let’s add a quick code snippet to the “About” page that will show what operating system the application is running on. Next, we’ll hit Ctrl + F5 to run it inside IIS Express, and we can see we’re running on Windows as we would expect. Now, to add Docker support to the application, right click on the project in Solution Explorer, choose Add, and then “Docker Project Support” (use “Docker Solution Support” to create containers for multiple projects). You’ll see that the “Start” button has changed to say “Docker” and several Docker files have been added to the project. Let’s hit Ctrl+F5 again and we can see that the app is now running inside a Linux container locally. Running the application in Azure Now let’s publish the app to Microsoft Azure App Service, which now offers the ability to run Linux Docker containers in a preview form. To do this, I’ll right click on the app and choose “Publish”. This will open our brand new publish page. 
Click the “Profiles” dropdown and select “New Profile”, and then choose “Azure App Service Linux (Preview)” and click “OK”. Before proceeding it’s important to understand the anatomy of how a Docker application works in a production environment: A container registry is created that the Docker image is published to The App Service site is created that downloads the image from the container registry and runs it At any time, you can push a new image to the container registry, which will then result in the copy running in App Service being updated. With that understanding, let’s proceed to publishing our application to Azure. The next thing we’ll see is the Azure provisioning dialog. There are a couple of things to note about using this dialog in the RC preview: If you are using an existing Resource Group, it must be in the same region as the App Service Plan you are creating If you are creating a new Resource Group, you must set the Container Registry and the App Service plan to be in the same region (e.g. both must be in “West US”) The VM size of the App Service Plan must be “S1” or larger When we click “OK” it will take about a minute, and then we’ll return to the “Publish” page, where we’ll see a summary of the publish profile we just created. Now we click “Publish” and it will take about another minute, during which time you’ll see a Docker command prompt pop up. When the application is ready, your browser will open to the site, and we can see that we’re running on Linux in Azure! Setting up continuous build integration and delivery to the Azure Container Service Now let’s set up continuous build delivery to Microsoft Azure Container Service. To do this, I’ll right click on the project and choose “Configure Continuous Delivery…”. This will bring up a continuous delivery configuration dialog. 
On the Configure Continuous Delivery dialog, select a user account with a valid Azure subscription, as well as an Azure Container Service with a valid container registry and a DC/OS orchestrator. When done, click OK to start the setup process. A dialog will pop up to explain that the setup process has started. As the continuous build delivery setup can take several minutes to complete, you may consult the ‘Continuous Delivery Tools’ output tool window later to inspect the progress. Upon successful completion of the setup, the output window will display the configuration details used to create the build and release definitions on VSTS to enable continuous build delivery for the project to the Azure Container Service. Conclusion Please download Visual Studio 2017 today, and give our .NET Core and Docker experience a try. It’s worth noting that this is a preview of the experience, so please help us make it great by providing feedback in the comments below."
}, | |
{ | |
"id" : "1479254400-fall-photo-pumpkin-tarts-pies-thanksgiving-2016-11-16-cheesecakemarbled-pumpkin-slab-pie", | |
"site" : "smitten kitchen Blog", | |
"title": "cheesecakemarbled pumpkin slab pie", | |
"url": "http://localhost:4000/fall/photo/pumpkin/tarts/pies/thanksgiving/2016/11/16/cheesecakemarbled-pumpkin-slab-pie.html", | |
"categories" : ["Fall","Photo","Pumpkin","Tarts/Pies","Thanksgiving"], | |
"tags" : ["Fall","Photo","Pumpkin","Tarts/Pies","Thanksgiving"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-16 00:00:00 +0000", | |
"content" : "So, I’m deep in my Friendsgiving planning for this weekend and I think I finally understand — and really, it’s about time, Deb — why Thanksgiving is so daunting, even for people who like to cook: it’s the volume. Read more »"
}, | |
{ | |
"id" : "1478736000-casserole-fall-fennel-freezer-20friendly-onions-photo-potatoes-side-20dish-thanksgiving-vegetarian-winter-2016-11-10-root-vegetable-gratin", | |
"site" : "smitten kitchen Blog", | |
"title": "root vegetable gratin", | |
"url": "http://localhost:4000/casserole/fall/fennel/freezer%20friendly/onions/photo/potatoes/side%20dish/thanksgiving/vegetarian/winter/2016/11/10/root-vegetable-gratin.html", | |
"categories" : ["Casserole","Fall","Fennel","Freezer Friendly","Onions","Photo","Potatoes","Side Dish","Thanksgiving","Vegetarian","Winter"], | |
"tags" : ["Casserole","Fall","Fennel","Freezer Friendly","Onions","Photo","Potatoes","Side Dish","Thanksgiving","Vegetarian","Winter"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-10 00:00:00 +0000", | |
"content" : "Last year, I proudly announced my intentions to host a Friendsgiving dinner for our crew and we would do it up. About 15 minutes later, I remembered that I had an infant and a zillion other less cute things on my plate and came to my senses. This year, I am a woman unwaveringly of my word, and I have 9 days to get my act together. Read more »"
}, | |
{ | |
"id" : "1478217600-apple-austrian-cake-fall-german-photo-2016-11-04-apple-strudel", | |
"site" : "smitten kitchen Blog", | |
"title": "apple strudel", | |
"url": "http://localhost:4000/apple/austrian/cake/fall/german/photo/2016/11/04/apple-strudel.html", | |
"categories" : ["Apple","Austrian","Cake","Fall","German","Photo"], | |
"tags" : ["Apple","Austrian","Cake","Fall","German","Photo"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-04 00:00:00 +0000", | |
"content" : "Because I don’t say it often enough, do know that one of my favorite things about this site is the way your presence, whether active or lurking, quietly provides the encouragement I need every time I want to tackle a dish or recipe that daunts me. Like bagels. Or Lasagna Bolognese. Or Baked Alaska. Or Russian Honey Cake. But I’m not sure that any of these dishes have struck terror in my heart — laced with impending doom over inevitable failure — over a dish as much as this. Read more »"
}, | |
{ | |
"id" : "1477958400-cauliflower-fall-photo-side-20dish-thanksgiving-vegetarian-2016-11-01-roasted-cauliflower-with-pumpkin-seeds-brown-butter-and-lime", | |
"site" : "smitten kitchen Blog", | |
"title": "roasted cauliflower with pumpkin seeds brown butter and lime", | |
"url": "http://localhost:4000/cauliflower/fall/photo/side%20dish/thanksgiving/vegetarian/2016/11/01/roasted-cauliflower-with-pumpkin-seeds-brown-butter-and-lime.html", | |
"categories" : ["Cauliflower","Fall","Photo","Side Dish","Thanksgiving","Vegetarian"], | |
"tags" : ["Cauliflower","Fall","Photo","Side Dish","Thanksgiving","Vegetarian"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-11-01 00:00:00 +0000", | |
"content" : "What did you do the last time you bought a head of cauliflower? Steam it? Grind it into rice? Puree it into a heap of hope that nobody will notice it’s not mashed potatoes? Roast it to a crisp, brown oblivion with olive oil and salt and eat it straight off the baking sheet? You sound sane. Read more »"
}, | |
{ | |
"id" : "1477267200-arugula-fall-freezer-20friendly-italian-meat-pasta-photo-winter-2016-10-24-broken-pasta-with-pork-ragu", | |
"site" : "smitten kitchen Blog", | |
"title": "broken pasta with pork ragu", | |
"url": "http://localhost:4000/arugula/fall/freezer%20friendly/italian/meat/pasta/photo/winter/2016/10/24/broken-pasta-with-pork-ragu.html", | |
"categories" : ["Arugula","Fall","Freezer Friendly","Italian","Meat","Pasta","Photo","Winter"], | |
"tags" : ["Arugula","Fall","Freezer Friendly","Italian","Meat","Pasta","Photo","Winter"], | |
"authors" : ["deb"], | |
"publishedDate" : "2016-10-24 00:00:00 +0000", | |
"content" : "At the end of July, a generally broiling, sticky month in New York City best experienced somewhere far enough away to catch a breeze not recently emitted from subway grates, I spied a recipe for a pork shoulder braised in chicken stock, aromatics, celery and thyme then torn into bite-sized shreds and tossed with broken-up pieces of lasagna noodles and finished with butter, lemon juice, parmesan and arugula that sounded so good, I had to make it the very next night for dinner. Even though it was 82 degrees out. Even though we’d been to the beach that weekend. I regretted nothing. Read more »"
}] |