marshall kirkpatrick

In a jaw-dropping move of bizarreness, Wall St. Journal writers Emily Steel and Jessica E. Vascellaro have called out major social networking websites tonight for violating user privacy, apparently by passing profile page URLs to advertisers as the referring URLs when users click on ads. We've emailed both writers to ask for clarification in the event that they are in fact referring to something else, but haven't heard back from them yet.

Update: Vascellaro has responded by email, emphasizing an apparently now-resolved if legitimate issue discussed vaguely as "in some cases" in the original story. Conflating that and the simple matter of referring URLs seems odd, to say the least. That said, it does appear that there were some grounds for debate around what was being communicated in some URLs. I've added some more thoughts, along with the text of Vascellaro's clearer explanation by email, to the footer of this post. I don't think the situation is as crazy now as I did when I first read it and wrote this post.

"Facebook, MySpace and several other social-networking sites have been sending data to advertising companies that could be used to find consumers' names and other personal details, despite promises they don't share such information without consent," the article begins.

See also: EFF Finds That Every Browser Has Unique Fingerprint

"Across the Web, it's common for advertisers to receive the address of the page from which a user clicked on an ad. Usually, they receive nothing more about the user than an unintelligible string of letters and numbers that can't be traced back to an individual.
With social networking sites, however, those addresses typically include user names that could direct advertisers back to a profile page full of personal information. In some cases, user names are people's real names."

It's just incredible. Go read it for yourself. Or don't. The tone of the article implies that some major scandal has been broken wide open. To be fair, some other people we've spoken with tonight agree with the Journal's assessment of the situation. This sure reads like anti-technology fear-mongering to me, though, and I've been one of Facebook's very loudest critics regarding privacy. Related but perhaps less surprising coverage of the Journal's story comes from Gawker, with the over-the-top headline Facebook Secretly Sold Your Identity to Advertisers. Hello, pageviews!

The Journal writers do allude to something a step beyond referring URLs when they write: "But Facebook went further than other sites, in some cases signaling which user name or ID was clicking on the ad as well as the user name or ID of the page being viewed." Those additional cases weren't discussed any more explicitly.

The Journal coverage even went so far as to claim that some social networks changed their behavior once questioned by said venerable publication! Facebook, according to the Journal, eliminated those mysterious "other cases" upon being questioned. So, problem fixed, right?

Of course, anyone who has ever looked at a website's traffic logs knows that referring URLs are shown to destination domains. [Ok, so that's not actually very many people in the world.] And yes, on social networks sometimes those URLs include profile names. As the Journal acknowledged, that doesn't mean it was the profile owner who clicked on the ad.

As the Journal's own coverage said: A Twitter spokeswoman said passing along the Web address happens when people click a link from any Web page. "This is just how the Internet and browsers work," she said.

That's right.
That's just how the Internet works.

Privacy and Facebook are serious issues. It's irresponsible and unhelpful to report on them like this. If we're reading this wrong, then at the very least it's being communicated poorly.

Update: Vascellaro's email to us in response:

Facebook was making it possible for advertisers to see ids for users who clicked (not just the profile url). This was happening through a ref equals profile code getting passed through after a user clicked on their profile and then an ad. Facebook acknowledged that this could be used to identify users who clicked, not just the profile of the user on whose page an ad appeared. They changed this after we alerted them to it, so it cannot currently be demonstrated. Others are just passing urls on pages viewed but myspace and fb said — and we reported — are working to obscure those too as it could be construed as personally identifiable data about some users, if not the users who clicked. Of course, whether people view it as personally identifiable varies, as we say. They are, however, changing it.

Decide for yourself then, readers.

Updated upon further reflection: I think if it had been put like this, the WSJ story would have been much clearer: Facebook used to, in some cases, send referring URLs with logged-in user IDs inside the URL when a user clicked on an ad. The Journal alerted them to that situation, and they now obfuscate those URLs. That's good. Potential privacy situation dealt with. Unfortunately, this is something that is hard to explain to non-technical readers, and in its attempt to do so, I believe the Journal's coverage left more technical readers confused and concerned that all referring URLs were being criticized unfairly. That is my working understanding of the situation right now.
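To make the mechanics concrete: when a browser follows a link or an ad click, it typically sends the address of the page it came from in the HTTP Referer header. A short sketch, using a made-up Facebook-style profile URL rather than any captured data, shows how easily a profile identifier can be read out of such a referrer:

```python
from urllib.parse import urlparse, parse_qs

def profile_id_from_referrer(referrer):
    """Extract a profile identifier from a referring URL, if one is present.

    This mimics what any ad server could do with a Referer header it
    receives; the URL format here is a hypothetical example.
    """
    parsed = urlparse(referrer)
    if "profile.php" in parsed.path:
        ids = parse_qs(parsed.query).get("id")
        return ids[0] if ids else None
    return None

# Hypothetical referring URL of the kind described in the article:
print(profile_id_from_referrer("http://www.facebook.com/profile.php?id=123456"))
# prints: 123456
```

A plain content page, by contrast, yields a referrer with no user-specific path, which is the distinction the Journal's story blurred.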
Hey, remember when the IBM computer Deep Blue went up against a chess grandmaster? That was cute. Well, now a "DeepQA" supercomputer will go up against someone with a brain in his head: a Jeopardy champion. Actually, the computer, named "Watson," will go up against two of the winningest players in the show's history, Ken Jennings and Brad Rutter.

Natural Language Machine

Watson is a "natural language processing question answer machine." If that makes you think of the erstwhile search engine Ask Jeeves, it should. Though no doubt the engineers behind Watson are praying to their clanking robot gods that it won't answer a question like "Where is CIA headquarters?" with a photo gallery featuring grown men in diapers. At any rate, Watson (named after the company's founder, not the pistol-packing pimp from the Sherlock Holmes stories) will take on Jennings and Rutter in two matches over three days, February 14, 15 and 16. It has already done a number of test matches against former contestants and passed the Jeopardy knowledge test all contestants must take to qualify for the show.

The company's announcement outlines the goals behind creating the computer. "IBM scientists . . . set out to accomplish a grand challenge – build a computing system that rivals a human's ability to answer questions posed in natural language with speed, accuracy and confidence. The Jeopardy! format provides the ultimate challenge because the game's clues involve analyzing subtle meaning, irony, riddles, and other complexities in which humans excel and computers traditionally do not."

Lord. Good luck. Unfortunately, the company did not release the kind of technical information that might allow us to determine how likely it is that the computer is really "understanding" the questions in any substantive way, vs. just employing a fancy keyword algorithm.
But we've contacted them for a comment.

Dr. Robot

IBM says the real-world applications of the technology that powers Watson "could be applied in areas such as healthcare, to help accurately diagnose patients, to improve online self-service help desks, to provide tourists and citizens with specific information regarding cities, prompt customer support via phone" and more. (Let's hope Dr. Robot comes with a complete set of Asimov's three laws of robotics. Insert your own Borg joke here.)

The challenge pays. The winner will take $1 million, second place $300,000 and third $200,000. IBM has committed to donating all of its prize money to charity, and the humanoids have committed to donating 50% of theirs.

curt hopkins

Other sources: Smarter Planet
The news that Google may be considering its own mobile payments service shouldn't actually be news to anyone who's been following the Internet search giant's latest moves – it's just a matter of connecting the dots. But the insider reports over on Bloomberg Businessweek today confirm that the thought has at least crossed Google's mind. According to "two people familiar with the plans," Google may launch the new mobile payments service, which allows consumers to tap or wave their mobile phones at a cash register to pay for their purchases, sometime this year.

Well, surprise, surprise.

Connecting the Dots: Android Has NFC

Google getting into payments? This isn't as crazy an idea as it sounds. The company's newest version of its Android mobile operating system, the revision code-named Gingerbread, has added support for Near Field Communication (NFC) technology. Although the technology is new to the U.S., many parts of Asia and Europe have long been using NFC to enable transactions, including everything from paying for subway tickets to purchasing Cokes from vending machines.

NFC works by way of a small chip embedded into mobile phones or other devices (or even stickers!) that allows the device to transmit data over short distances.

At the moment, the NFC support in Google's Android software allows for one-way, read-only transmissions, but that limitation is only temporary. NXP's Jeff Miles, the company's director of mobile transactions, recently confirmed that Android would be updated to include both read and write support in a future version of the mobile software. The update is expected to show up in Gingerbread itself, instead of in an entirely new software version, like the forthcoming Honeycomb release, due out later this year.
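The read-only support described above means a Gingerbread phone can read NDEF (NFC Data Exchange Format) messages from a tag, such as the URL stored on a smart sticker. As a rough illustration of what's actually on such a tag, here is a minimal decoder for a single short-record NDEF URI record, following the NFC Forum's published record layout; the sample bytes are invented for the example:

```python
# Minimal decoder for one short-record NDEF URI record, the kind of
# payload an NFC sticker might carry. Assumes a single record with the
# short-record (SR) flag set and the well-known "U" (URI) type.
URI_PREFIXES = {0x00: "", 0x01: "http://www.", 0x02: "https://www.",
                0x03: "http://", 0x04: "https://"}

def decode_ndef_uri(data: bytes) -> str:
    header = data[0]
    assert header & 0x10, "only short records handled here"  # SR flag
    type_len = data[1]
    payload_len = data[2]
    rec_type = data[3:3 + type_len]
    assert rec_type == b"U", "not a URI record"
    payload = data[3 + type_len:3 + type_len + payload_len]
    # First payload byte selects an abbreviation prefix; the rest is the URI.
    return URI_PREFIXES.get(payload[0], "") + payload[1:].decode("utf-8")

# Example tag contents: header 0xD1 (MB|ME|SR, TNF=well-known),
# type length 1, payload length 12, type "U", prefix 0x01, "example.com"
tag = bytes([0xD1, 0x01, 0x0C]) + b"U" + bytes([0x01]) + b"example.com"
print(decode_ndef_uri(tag))  # prints: http://www.example.com
```

Writing records like this back to tags is exactly what the promised read/write support would add.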
sarah perez

With read/write support in place, phones running the Android software that also include the necessary hardware would be NFC-enabled, and therefore mobile payments-enabled.

Google Goes Local with NFC Stickers, Hotpot Program

In addition to the technological support for NFC transactions in Android phones, Google has also launched a local advertising program called Hotpot. The program is focused on allowing businesses – primarily restaurants, bars and cafes, it appears – to advertise themselves to customers by way of NFC-enabled window decals. Hotpot, still in its pilot phase in Portland, Oregon, lets a passerby wave an NFC phone at the sticker, which in turn takes them to a mobile Google Places Page for that business via their device's Web browser.

These Places Pages serve much the same function as services like Yelp do – they provide a location's name, address, phone number and other details alongside user-generated ratings and reviews. And you can see which establishments your friends liked, too.

Google Acquires Mobile Payment Company, Tried to Get Groupon

Another recent Google acquisition, which flew under the radar until some eagle-eyed analysts at the 451 Group spotted it, was of mobile payments firm Zetawire. The small Canadian company had one thing going for it: a patent application for mobile banking, advertising, identity management, credit card and mobile coupon transaction processing.
In other words, a complete mobile wallet solution.

Google's other recent, but failed, acquisition attempt – that of local couponing service Groupon – could have also tied into this mobile wallet initiative the company reportedly has in the works.

Bloomberg's article didn't deliver much new information about any of these services, only confirming that indeed, the dots are being connected and something big from Google is well on its way. It also noted that NFC phone shipments are expected to rise to 220.1 million units by 2014, a figure that indicates that the mobile phone-toting world is ready for such a service to exist.

Challenges to Google's Mobile Payments: Carriers, Other Manufacturers, Credit Card Companies, More

But Google won't be without its challengers. PayPal is expected to dabble in NFC payments, too, this year. Apple hired an NFC expert, and both Apple and RIM have filed NFC-related patents. Complete services from the likes of Visa and the mobile phone carriers themselves have also either launched or will be launching this year. Even startups like Bling Nation are getting in on the action, NFC-enabling old phones by way of stickers.

Google will have to walk a fine line if it wants to avoid the "creep" factor. People are already sensitive about the amount of data the Internet search company has on file. In fact, that issue is now being used as a marketing technique for the upstart search engine DuckDuckGo, which touts its privacy features by way of a website at donttrack.us ("When you search Google…your search term is sent to that site, along with your browser and computer info, which can identify you.")

How will people feel about a Google service that tracks their jaunts about town, their favorite local businesses, their couponing habits, their financial information, their bank accounts, their spending habits and their purchase history?
Technology aside, that could be the biggest hurdle Google has to overcome to make its mobile payments business a successful one.

ReadWriteWeb's Related Coverage on NFC Developments:

What's Google's Interest in Zetawire? An Android Mobile Wallet
Nexus S and Gingerbread Phones to Get Full NFC Support Soon
AT&T, Verizon, T-Mobile Join Forces in New Mobile Payments Venture Called "Isis"
Google Aims to Replace Credit Cards with Addition of NFC to Android
Apple Hires NFC Expert, Mobile Payments Coming to iPhone?
Shazam, the magic mobile app that identifies music playing in the room you're in, has added a new option for interacting with a song it's identified: listen in Spotify. Spotify, a killer desktop and mobile music app that's so far available to only a few thousand testers in the United States, is widely loved in Europe. It's a great way to discover new music, listen to an artist's full or nearly full body of work and just generally get your groove on. I'm lucky to be listening to the band Bassnectar on Spotify as I write this post. I used Shazam earlier tonight to identify several songs, however, and did not have the option to fire those artists up in Spotify. I wish I could have.

Along with creating a Pandora or Last.fm channel based on an identified artist, watching that artist's YouTube videos, searching for their nearby tour dates and buying their music as MP3s – Spotify subscribers are now going to have a great new way to start listening to new music they discover on Shazam. The advantage of using Spotify, instead of Pandora or Last.fm, is that Spotify users can easily listen to entire albums and navigate among related artists much more freely than on most other services.

According to the Spotify blog, this feature is only available to Spotify Premium subscribers at launch, but will be made available to free account holders soon. If you live in the United States and have not been fortunate enough to get one of the relatively few Spotify preview accounts, prior to what everyone hopes are pending US licensing agreements with the major record labels here – well, don't hold your breath.
ReadWriteWeb cannot be held responsible if you burst while waiting.

As a happy new subscriber to the similar service Rdio, I hope that Shazam will integrate listening in Rdio as well. A seamless ability to listen to entire albums of whatever musician or group is playing on the speakers in a room you're in, with just a click or two and some awesome music identification technology, is truly incredible. Thank goodness for bandwidth, smartphones and streaming mobile music.

marshall kirkpatrick
I've never been there, but according to a recent article in the New York Times (sorry if I am relying on this paper too much for inspiration), the city of Djenne, Mali, is a veritable museum of historic mud brick buildings. Among them is the Grand Mosque, the largest mud brick, or adobe, building in the world, originally built in the 13th century and replaced with the current building in 1907.

In addition to the mosque, there are hundreds of mud brick homes in current use that, according to the city's World Heritage site designation, may not be updated. This apparently restricts owners from making improvements such as tiling floors, adding windows to rooms that have none, and installing showers or even screen doors. These restrictions have created quite a backlash, including a riot in 2006 following an initial restoration survey.

Tourist-driven urban planning?

In recent years, the city has developed serious sewage problems, as there is no central sanitary system. This, along with open trash dumps in the area, caused tourists to complain to UNESCO, which warned the city that it was at risk of losing its World Heritage site designation. Apparently this designation is important to the tourism industry, which is a major source of income for the area. So, while in theory the city welcomes the designation, the program prohibits many changes to buildings, including many interior renovations. One house is described as having a room that measures 6 feet by 3 feet, without any windows; under UNESCO regulations, the room cannot be changed from its grave-like current design. While I appreciate the efforts to avoid losing historic buildings, since when does tourism trump the right of people to improve their homes? Man, I'm starting to sound like a libertarian!

I feel their pain

I imagine that these residents hope to improve their living conditions through home improvements, which apparently they are restricted from doing.
While I make no claims that my problems with the local historic commission compare to the challenges of the residents of this World Heritage site, there are some similarities. They just want to make their homes comfortable, clean, and safe, but by doing so they run afoul of regulations. In my historic district, things I want to do that would create a higher-performance, more sustainable home are restricted, due mostly to pressure applied by a small but vocal minority in the neighborhood. While I believe that effective laws and regulations help maintain a safe and comfortable living environment, many of those laws and regulations are out of date, are counterproductive, and often lead to poor solutions that benefit no one.

Is there a solution?

Obviously, having no regulations isn't the answer, but neither is more regulation necessarily a suitable solution. Some neighborhoods have elected not to seek historic designations, leaving more options for homeowners choosing to build or renovate than those living in areas that have been designated historic. I haven't seen evidence that being in a historic district implies better or more appropriate design; rather, it tends to satisfy that vocal minority and its particular tastes. Historic committees are made up of people who are fallible and, like most groups, tend to make decisions that comprise a range of compromises (not unlike our federal government). I'm not sure that we will find solutions that will satisfy me and my local historic commission, or the citizens of Djenne and the administrators of the World Heritage designation. Maybe we can find a benevolent dictator to take over and judge with a fair hand. Any volunteers out there?
Officials in a well-to-do community in the Minneapolis-St. Paul area hope they can spur the development of a neighborhood of green, energy-efficient homes with "midmarket" pricing attractive to young families. Eden Prairie, a city of about 60,000 a dozen miles southwest of Minneapolis, has informally selected Homestead Partners to develop 36 houses that would sell for between $240,000 and $360,000, according to an article in the Minneapolis Star Tribune.

That may not seem like a bargain-basement price to many people, but new homes in town often sell for $600,000, the newspaper said, and in August there were twice as many houses on the market selling for $456,000 and up as there were for between $285,000 and $455,000. In Minneapolis, officials are pushing a plan to build green homes that sell for no more than $200,000, with help from a $1 million annual subsidy. In Eden Prairie, officials are trying to avoid putting public money into the program.

Four developers submit proposals

Four developers responded to the city's request for proposals. City staffers interviewed the three most qualified firms, and Homestead's "Eden Gardens" got the preliminary OK. A formal plan with cost estimates is due in November, to be followed by public hearings. With city approval, construction could start in the spring.

Homestead's project manager, Tom Strohm, says it's too soon to say exactly what features the houses would have to enhance energy efficiency, indoor air quality, water management and the like. But he said developers were planning to build houses so they could win advanced certification under Minnesota's Green Path program. The program's web site describes that as a "mid-level green certification" in which houses must have a HERS rating of no more than 60 plus win points for meeting certain energy efficiency and environmental standards. The houses would range in size from 1,800 sq. ft. to 2,400 sq. ft. The architect for the project is Whitten Associates of Minnetonka, Minn.
Not everyone is thrilled

Support on the Eden Prairie City Council isn't universal, according to the Star Tribune. City Councilor Brad Aho doubts the houses can be built at the projected prices, and he doesn't like the idea that the city will buy the property from the Minnesota Department of Transportation and then sell it to the developer.

Neighbors also expressed concerns because what's now a dead-end road would be extended to provide access to the new development, meaning more noise and traffic in an area where children now ride their bikes and play. And some worry that building less expensive houses next to their own would erode property values. "In an ideal world, we'd love to keep it undeveloped," Kathryn Atterberry told the newspaper. "It's why we bought the house we bought." Should the property be developed, she said, there should be fewer, more expensive houses that better match what's already there.

One possible compromise would be to build a dozen market-rate houses on the fringes of the property, presumably to shield the neighborhood from lower priced houses toward the interior.
Austin aims for 55% renewables

It will take developments like these if the Austin City Council is to reach its aggressive efficiency goals. Last December, the city approved a plan to get 55% of its power from clean energy by 2025, Climate Progress reported. In addition to providing 600 megawatts (MW) of utility-scale solar, the proposal would have Austin Energy, the municipally owned utility, find 200 MW of local solar, at least half of which would have to be owned by customers. Energy efficiency and improvements in demand response were to provide another 800 MW of power over the next 10 years.

"It's clear that to achieve the ambitious goals Austin Energy has set for itself, we must significantly increase the number of rooftops generating power from the sun," Austin Energy vice president of customer energy solutions Deborah Kimberly told MyStatesman. "Communities like this with solar integrated into the design from day one allow us to make faster progress toward those goals in ways that allow us to plan infrastructure and protect the overall stability of the electric grid."

The right opportunity for solar

David Grove, Lennar's division president for the Austin and San Antonio markets, said by telephone the company had been looking into adding solar for a number of years but until now hadn't found a way to make it available affordably in the Austin market. SunStreet's lease option made the difference. "I think it's clear that homeowners want solar, they want technology," he said. "If you asked anybody off the street, 'Would you like solar on your home, and do you understand it makes sense?' the answer is a resounding, 'Yes!' The challenge has been what a homeowner is willing to pay, and that's always been the prohibiting factor.
Now we've found a way to provide the solar without the homeowner paying out of pocket at a reasonable, very economical rate."

Lennar is the country's second largest builder, with $6.8 billion in housing sales and 21,000 closings in 2014, according to Professional Builder. The company has roughly 120 solar communities underway in several states, including California, Colorado and Maryland, adding up to "several thousand rooftops" in all. Lennar expects to launch a second solar community in Austin early next year. "And strategically," Grove said, "I think we will try to pursue it in every community we open moving forward, and then we execute it where we're able to."

"Folks historically have looked at solar as something that you're only going to see on expensive, million-dollar homes because they have the ability to utilize it and it's cost prohibitive to others," he continued. "I think as we make this more commonplace and expand the footprint it will become more and more the norm."

Grove expects most if not all buyers at Colorado Crossing will opt for a solar lease, not an outright purchase. Leasing is by far the most common way homeowners go solar, accounting for as many as 95% of PV installations at Lennar subdivisions, particularly in areas where buyers don't have a lot of discretionary income, Grove said. Lennar is technically creating power-purchase agreements with buyers rather than straight leases, The Wall Street Journal notes, because homeowners pay only for the power they use and not the system itself. But homeowners are paying less for electricity than they would from the local utility, so they are saving money regardless of what the arrangement is called.

UPDATED Oct. 8, 2015

Work is underway in Austin, Texas, on two housing subdivisions that emphasize energy efficiency, including one in which all 7,500 homes will be constructed to zero-energy standards. Taurus of Texas, a real estate investment firm, said in September that it was starting construction of the first 237 houses at Whisper Valley in East Austin, according to a report at MyStatesman.com. Company officials said it would be the first large single-family development in which all houses were designed to produce as much energy as they use on an annual basis. Houses also will come with fiber optic systems from Google Fiber, which says Whisper Valley is the first time the extremely fast internet service is being installed in a new housing development.

Separately, Lennar announced it would build Austin's first "solar standard community," where each house would have its own rooftop photovoltaic (PV) array. Homeowners could either lease the panels or buy them outright. The first phase of Lennar's Colorado Crossing subdivision, underway near the Austin-Bergstrom International Airport, will include 120 homes, ranging in size from about 1,200 square feet to 2,800 square feet and costing between $195,000 and $277,000, MyStatesman said in another report.

Balancing performance and price

Construction details about houses at Whisper Valley weren't available, but MyStatesman said that the intent was to seek a market niche where houses with low energy demands also would be affordable.
Taurus expects to price the houses between $150,000 and $275,000, which is higher than the median market value for the immediate area around the development but about in line with the median market value in the Austin metro area of $267,000. Taurus partners include Bosch, which will provide energy-efficient kitchen appliances, ground-source heat pumps, and water heaters; Google Nest, which will supply its web-connected thermostats; and Google Fiber. The web report said that Whisper Valley residents will pay a fixed utility fee averaging about $175 per month, which would cover the cost of the rooftop PV, an LED lighting package, the Bosch appliances, ground-source heat pumps, and maintenance costs.

In addition to the zero-energy houses, the development also will eventually include townhouses, apartments, and more than 2 million square feet of office and retail space. A second phase of the project, including 200 additional houses, would be started next year. MyStatesman said that Taurus bought 2,062 acres for the project in 2006, but the development was sidetracked by the meltdown of the real estate and financial markets. Later, the company partnered with the City of Austin, which issued bonds to finance highway, water, and sewer lines in the area in return for the company's help in achieving Austin's carbon-reduction goals.

Solar panels for everyone

At Colorado Crossing, home buyers will be able to choose between leasing or buying the solar panels on their homes, but they won't get the chance to say "no thanks." Every house in the subdivision gets them. The panels will be installed and maintained by SunStreet Energy Group, a Lennar subsidiary, whose CEO said that the decision to include solar was not so much about "being green" as about making sound business decisions. "This is about consumer relevance, this is something they want," he told MyStatesman.
"We have not built this business on being green — this is a real business built on economics and consumer needs."

Homeowners who buy the systems will pay $15,000 upfront, although they will be eligible for the 30% federal investment tax credit through the end of 2016. Home buyers who choose the lease option will pay between $45 and $65 per month, the company said, depending on the size of the system. Homeowners who lease their systems will buy their power from Austin Energy at a discounted price.

It wasn't clear how big the arrays are, and GBA was unable to reach anyone at the company who could offer additional information on solar capacities or mechanical systems. However, David Kaiserman, SunStreet Energy's CEO, told the website UtilityDive.com that the panels were expected to meet about 60% of each home's electricity needs. The systems range from 3.18 kW to 5.3 kW in capacity and average 3.45 kW.

According to the Colorado Crossing website, the slab-on-grade houses are insulated to R-15 in the walls and between R-22 and R-38 in the ceilings, and come with radiant-barrier roof decking to reduce attic temperatures. "Technology" features include programmable Wi-Fi-capable thermostats. Houses are heated with gas furnaces and cooled with Lennox 16 SEER air conditioning systems. Lennar is using the IRC's performance-based method for code compliance. The houses, which are inspected by an independent third party, "consistently" receive HERS scores in the mid-60s, the company said.

The company said that 20 homes have been completed, with a total of 35 sold. A second and possibly a third phase, with another 120 houses each, are in the works, but Lennar didn't have a firm timetable for a complete build-out.
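As a rough sanity check on that 60% figure, annual PV output can be estimated from system size and a local production factor. The sketch below uses the reported average 3.45 kW array and an assumed central-Texas yield of about 1,450 kWh per installed kW per year; the yield number is an assumption for illustration, not a figure from the article:

```python
# Back-of-the-envelope check: what would a 3.45 kW array produce in a
# year, and what home usage would that imply if it covers ~60% of needs?
AVG_SYSTEM_KW = 3.45       # reported average array size
KWH_PER_KW_YEAR = 1450     # assumed annual yield per kW (central Texas)

annual_output_kwh = AVG_SYSTEM_KW * KWH_PER_KW_YEAR
implied_home_use_kwh = annual_output_kwh / 0.60  # if output covers 60% of use

print(f"Estimated annual output: {annual_output_kwh:.0f} kWh")
print(f"Implied annual home use: {implied_home_use_kwh:.0f} kWh")
```

Roughly 5,000 kWh of output implying about 8,300 kWh of annual usage is plausible for the smaller, efficient houses described, which suggests the 60% claim is in the right ballpark under these assumptions.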
I recently asked four score and seven people familiar with green building, "What one change would you make to LEED to encourage green building?" This was a highly unscientific poll: the small sample was not randomly selected and is not statistically representative, but each respondent was a professional working on green building projects. And when coupled with the result that more than 50% offered the same solution, while the next most offered solution polled at only 8%, the responses to this question, posed by someone not associated with the U.S. Green Building Council, are a useful tool.

Against a backdrop of the increasing belief, as recently articulated by Bill Gates, that current green technologies can reduce carbon dioxide emissions only at costs that are "beyond astronomical," many have begun to focus on another frontier of innovation: using energy more efficiently and thus using less of it. Given that many green buildings consume 25% less energy than conventional buildings, green building may be the single greatest current opportunity to address climate change.

Q&A with Rick Fedrizzi, President and CEO of USGBC
Are LEED-Certified Buildings Energy-Efficient?
GBA Encyclopedia: LEED for Homes
What LEED Credit Is Almost Never Achieved?
LEED Can Help Fix the Water Problem
Recent Changes to LEED for Homes — Part 1
Why Is the U.S. Green Building Council So Out of Touch?

Existing buildings are a worthy target

There are nearly 4.9 million commercial buildings in the U.S., making existing buildings a target-rich environment. (Each year only about 170,000 new commercial buildings are constructed.) And those existing buildings are tremendous consumers of electricity, accounting for 74% of the total electricity consumption in the U.S.
Enlarge the pool of LEED-eligible buildings

We can now return to the question, “What one change would you make to LEED to encourage green building?” The number one answer, by far, is to allow every existing building that improves its Energy Star score by at least 20% to be LEED EB eligible. That single change to the rating system would make millions of buildings eligible to participate in LEED. This change might be the single greatest current opportunity to address climate change. There are a variety of green building standards, codes, and rating systems, but LEED (Leadership in Energy and Environmental Design) commands more than 95% of the market share in the U.S., so the existing-building challenge may hinge on LEED.

The LEED for Existing Buildings: Operations & Maintenance rating system is highly regarded. The system identifies and rewards current best practices and provides an outline for buildings to use less energy and uncover operating inefficiencies. In 2014, LEED for Existing Buildings: Operations and Maintenance was once again the most popular rating system, with existing buildings representing 48% of the total square footage that obtained LEED certification.

LEED EB was launched in 2004. The problem is that today there are only 3,778 certified LEED EB buildings. That is less than one tenth of 1% of the 4.9 million existing commercial buildings.

Many buildings are excluded by this requirement

An existing building that cannot achieve an Energy Star Portfolio Manager rating of 75 is excluded from participating in LEED v4. (An Energy Star score of 75 means the building is performing better than 75% of similar buildings nationwide.) The prerequisite of a minimum score of 75 arguably excludes 75% of all existing buildings from participating in LEED. This is significantly more stringent than LEED 2009, which required an Energy Star score of 69.

There is a pilot credit known as EAp2 Energy Jumpstart.
This pilot credit allows an existing building that reduces energy consumption by 20% to be LEED v4 eligible. But, after much internal debate, it was decided that this pilot credit will be available only to the first 500 applicants.

Stuart Kaplow is an attorney concentrating in real estate and environmental law. He has served as legal counsel and is past chair of the U.S. Green Building Council Maryland. This post was originally published at Green Building Law Update.
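To make the thresholds concrete, here is a minimal sketch of the eligibility rules described above. The function and parameter names are my own invention, and the logic only illustrates the numbers cited in this post; it is not an official USGBC tool.

```python
# Hypothetical sketch of the LEED EB eligibility rules discussed above.
# Thresholds (69, 75, 20%, 500 applicants) come from the post; everything
# else is invented for illustration.

def leed_eb_eligible(energy_star_score, energy_reduction_pct=0.0,
                     version="v4", jumpstart_slots_left=0):
    """Return True if an existing building could pursue LEED EB certification."""
    if version == "2009":
        # LEED 2009 prerequisite: Energy Star score of at least 69
        return energy_star_score >= 69
    # LEED v4 prerequisite: Energy Star score of at least 75
    if energy_star_score >= 75:
        return True
    # EAp2 Energy Jumpstart pilot: a 20% energy reduction qualifies,
    # but only while slots remain (capped at the first 500 applicants)
    return energy_reduction_pct >= 20.0 and jumpstart_slots_left > 0
```

Under this sketch, a building scoring 60 is shut out of LEED v4 unless it cuts energy use by 20% while Jumpstart slots remain, which is exactly the bottleneck the poll respondents want removed.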
A lot of discoveries and research work over the past four decades have led to our current understanding of air leakage in buildings. I’ll mention a few here, but I want to focus on one: the MAD AIR paper by John Tooley and Neil Moyer. The full title of the paper was “Mechanical Air Distribution And Interacting Relationships.” The first letters of those words spell out MAD AIR.

A bit of pressure testing history

In classes that teach people some basic building science (e.g., BPI Building Analyst, HERS rater), pressure testing is one of the most important topics covered. We teach people how to do blower door tests, duct leakage tests, and a little zonal pressure diagnostics. We show them how duct leakage or closed doors can create pressure differences. And we make the connection between air leakage and comfort, health, durability, and efficiency. But where did all this come from?

Home Energy magazine published a nice article on the history of the blower door back in 1995. First developed in Sweden in 1977, the blower door came to America a couple of years later. By 1986, there were 13 companies making this invaluable tool.

Also in the late ’70s, Gautam Dutt and his colleagues at Princeton University combined pressure testing with infrared imaging. They called themselves the House Doctors, diagnosing home performance problems and prescribing air sealing remedies. Dutt also gets credit for discovering the “thermal bypass” while studying heat loss in New Jersey townhouses.

The 1980s were an exciting time to be doing this research because the new tools gave energy auditors a way to find out what was really going on in our buildings. At first, though, the focus was on air leakage through the building enclosure.
They were looking for holes that robbed homes of conditioned air or brought in unwanted unconditioned air.

Tooley and Moyer’s MAD AIR research

In the 1980s, John Tooley and Neil Moyer had a small company called Natural Florida Retrofit, Inc. They were doing pressure testing and fixing homes in central Florida, a hot, humid climate where pretty much everyone has an air conditioner. By the ’80s, those air conditioners were mostly of the central type, with ducts running through attics, garages, and crawl spaces.

Tooley and Moyer wrote the MAD AIR paper in 1988, documenting their findings from 371 single-family homes. They first found the infiltration rate of the homes (see Image #2, below). Then they looked at what happened to the pressures between inside the home and outside when they did crazy things like turning the air handler on or closing bedroom doors.

What they found was that duct leakage can have a big effect on the pressure in a home. They also found that closing bedroom doors can have a big effect. And they correlated these issues with the airtightness of the homes. Here are some of the main takeaways from their paper:

1. Return leakage outside the enclosure causes the house pressure to go positive, which results in conditioned air leaking out.
2. Supply leakage outside the enclosure causes the house pressure to go negative, which results in unconditioned air leaking in.
3. Closing doors to bedrooms that don’t have return vents causes the main part of the house to develop negative pressure and the bedrooms to go positive.

If you’ve taken any type of building science class, this should sound familiar. For example, we have sayings for numbers 1 and 2 above: Return leaks blow; supply leaks suck. Now you know where it comes from.

One of the most important aspects of their paper was that they saw the implications of these problems.
They wrote, “These factors can be major contributors to (1) excessive energy consumption, (2) poor thermal comfort, (3) degradation of building materials and (4) indoor air quality problems (i.e. homeowner health to the possible extent of illness, grave sickness and even death).” They discussed the role of mechanical systems in backdrafting and mold growth, the effects of homeowners closing registers to unused rooms, and more. This was groundbreaking work!

Tooley and Moyer weren’t the first ones to do this kind of work. They cite a couple of earlier papers on infiltration and air conditioners. (See page 5 of their paper.) But I believe they may have been the first to put all this together and see how important it was to deal with the mechanical systems and their effect on house pressures and air leakage.

At the 2016 Affordable Comfort conference in Austin, I had the honor of moderating a couple of sessions called Insights from Home Performance Veterans. The speakers were Gary Nelson of The Energy Conservatory, David Keefe of Vermont Energy Investment Corporation, and John Proctor of Proctor Engineering. They shared a lot of great stories of those early days when we learned so much about how buildings work, including Tooley and Moyer’s MAD AIR work.

Lstiburek takes it further

And then there’s Joe Lstiburek, who had been doing a lot of building science work during the ’80s as well. He pushed the airtight drywall approach, studied backdrafting, and did a lot of other work to understand the building enclosure. The MAD AIR paper by Tooley and Moyer was an important part of his doctoral research. His PhD thesis is called Toward an Understanding and Prediction of Air Flow in Buildings. The first people he thanked in the acknowledgments were his advisors. The second two mentioned were Tooley and Moyer.
Here’s how the abstract begins: “This thesis makes two fundamental arguments in the analysis of air flow in buildings: buildings are complex three dimensional air flow networks driven by complex air pressure relationships; and the key to understanding air flow in buildings is the building air pressure field.”

In short, he took what Tooley and Moyer started and turned it into a full academic exposition of pressure and air flow in buildings. He worked through the pressure differences that drive air movement between indoors and outdoors, often going through interstitial spaces (a term he used 127 times in his thesis).

We all owe a big debt of gratitude to the building science pioneers who helped lead us to our current understanding of these principles.

Allison Bailes of Decatur, Georgia, is a speaker, writer, building science consultant, and the author of the Energy Vanguard Blog. Check out his in-depth course, Mastering Building Science at Heatspring Learning Institute, and follow him on Twitter at @EnergyVanguard.
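As a toy illustration of the MAD AIR findings discussed above (not Tooley and Moyer’s actual analysis), the sign conventions can be sketched in a few lines. The function name and the simple “net leakage” model are invented for illustration only.

```python
# Toy model of the MAD AIR sign conventions: return leakage outside the
# enclosure pressurizes the house ("return leaks blow"); supply leakage
# outside the enclosure depressurizes it ("supply leaks suck").
# This is an illustration, not the paper's actual calculation.

def house_pressure_sign(return_leak_cfm, supply_leak_cfm):
    """Qualitative main-body house pressure when the air handler runs."""
    net = return_leak_cfm - supply_leak_cfm  # return leakage pushes positive
    if net > 0:
        return "positive: conditioned air leaks out"
    if net < 0:
        return "negative: unconditioned air leaks in"
    return "neutral"
```

The real-world effect also depends on how airtight the house is, which is exactly why Tooley and Moyer correlated these pressures with each home’s measured infiltration rate.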
Facebook has changed the way it seeks permission to share users’ personal information with third-party apps and games, a move that could land the social network in a European court. A German consumer advocacy group is threatening to sue the social network unless it gives users a clear choice on whether they agree to share information.

Why Privacy Advocates Are Angry

What has provoked the Federation of German Consumer Organizations is Facebook’s decision to drop the “Allow” and “Don’t Allow” buttons used in choosing whether to share personal information with a game or app. Rather than offer the choice on a no-nonsense permissions page, Facebook now shows in the App Center a page with colorful images and a single button that reads, “Play Game.” Under the button, the list of shared data appears in gray font, which could make it seem less important than text in bolder type.

Avi Charkham, co-founder of MyPermissions, lists five design elements on Facebook’s new page that he claims play down the fact that in choosing to play a game, people are giving up lots of information. To see exactly what’s being shared with the app developer, users have to roll over a question mark that then shows they are giving up their name, friends list, email, user ID, networks and “any other information you made public.”

Getting such information to advertisers is the core of Facebook’s business model – which naturally leads to pressure to push the envelope on privacy. “Facebook will push the concept of consent to the breaking point,” said David Jacobs, consumer protection fellow at the Electronic Privacy Information Center.

Has Facebook Crossed The U.S. Line?

To the Federation of German Consumer Organizations, Facebook crossed the line with its latest site change. In the U.S., though, that line is blurry. Under a recent Federal Trade Commission order, Facebook is required to first obtain “affirmative, express consent” from a user before sharing information with a third party.
In addition, the company has to “clearly and prominently disclose” the information shared. Facebook did not respond to a request for comment. Presumably, it believes that by agreeing to play a game, users are also agreeing to share personal data. Also, rolling over a question mark for details could technically satisfy the disclosure requirement. “Having the user click to use the app probably counts as affirmative consent,” Jacobs said. “But the App Center disclosures might not be clear and prominent.”

Pressure On Privacy Won’t Stop

Whether or not the recent changes meet U.S. or German requirements, it’s unlikely the company will stop looking for ways to mine the wealth of information it collects from users. In a recent interview, Facebook engineer Andrew Bosworth, whose mission is to find ways to increase ad dollars from the company’s mobile app, said he is exploring ways to use a smartphone’s audio sensor to trigger relevant ads. For example, if you’re at a Bruce Springsteen concert, then you might be in the mood to buy the artist’s latest album. The site is also investigating the money-making possibilities of sending ads based on a user’s place and time. “Maybe you’re walking past somewhere we know you’ll like and it tells you there’s a deal you can get,” Bosworth told Technology Review, published by the Massachusetts Institute of Technology. “Ads don’t have to be a distraction.”

Facebook’s Balancing Act

Like other Internet companies, Facebook is fishing for profits in the ocean of user information it gathers. But Facebook is unique in that it has to balance selling the information people think they are sharing only with friends while also respecting their privacy. That’s a high-wire act that’s likely to require constant adjustments to avoid a disastrous fall.
Settling the latest shareholder suit from Hewlett-Packard’s Autonomy debacle should be easy. All HP has to do is show that it has actually built some marketable software out of the $10.3 billion acquisition. The question is, where is that software?

Stanley Morrical isn’t convinced such software exists, so last week he sued HP in Federal court in San Jose, CA, accusing the company of fraud. Morrical is not buying HP’s claims that Autonomy executives duped it into buying the British software maker last year through “serious accounting improprieties, misrepresentation and disclosure failures.” HP has asked US and British regulators to investigate for criminality.

HP’s Alleged Cover Up

HP says it will take an $8.8 billion write-off from the purchase of Autonomy, most of it due to paying too much for Autonomy because of alleged shenanigans with Autonomy’s accounting prior to the acquisition. But Morrical says all of these allegations are covering up HP’s incompetence in failing to upgrade Autonomy’s software and release it as a sellable product. “In an effort to conceal their own gross mismanagement, fraudulent conduct and potential exposure to securities claims, HP’s officers and directors have blamed the entirety of the $8.8 billion write-down on accounting issues,” Morrical’s lawsuit says. HP did not respond to a request for comment.

The suit’s allegations stem from HP’s announcement in November 2011 that IDOL 10, a major upgrade of Autonomy’s IDOL 7, was ready. In general, IDOL software searches, organizes and manages all data within an enterprise. The upgrade included integration with HP’s data analytics application, acquired that same year with the purchase of Vertica.

Where’s IDOL 10?

While claiming to have IDOL 10 ready, HP actually had nothing to sell, Morrical alleges.
Essentially, he claims, IDOL 10 was vaporware. “You go out in the market and say it’s available and it’s not,” said Aron Liang, an associate at the San Francisco law firm Cotchett Pitre & McCarthy, which is representing Morrical. “So either they knew it and they’re lying, or they don’t even know what they’re selling, which in some ways may even be worse.”

David Schubmehl, a tech analyst for International Data Group, said he was briefed on IDOL 10 in June. However, Schubmehl says he hasn’t talked to any companies using the software. “I can’t confirm that anyone is actually using IDOL 10,” Schubmehl said. “However, I have had briefings about that back in June and it certainly seemed to be part of their big data offerings.” In an interview with the San Jose Mercury News, an HP spokesman declined to comment on the status of IDOL 10.

The suit also accuses HP’s leadership of corporate waste and of failing to meet their legal obligation to act in the best interest of shareholders.

Buying Autonomy

HP made the accounting allegations against Autonomy in November of this year, roughly a year after agreeing to buy the software maker. Other tech companies and industry experts have said Autonomy was overpriced. The deal was brokered by HP’s then-CEO Leo Apotheker, who was ousted months later and replaced with ex-eBay Chief Executive Meg Whitman. Whitman, who was on the HP board when the Autonomy deal was approved, takes no responsibility for the purchase and believes HP shouldn’t either.

But shareholders aren’t buying it. Others who have filed suits over Autonomy claim they are the real victims and they want their day in court. In the case of Morrical, he also wants to see some real software come out of the deal. If HP has it, then the company shouldn’t have any trouble showing him.

Image courtesy of Shutterstock.
Reason 10: Conspiracy Theory Time

Apple, along with the rest of the market, is manipulated – by whom, I don’t know: Ben Bernanke, Jamie Dimon and Goldman Sachs, the Rothschilds, aliens from the hollow earth – all definite possibilities. Last Friday, the blogosphere was alight after Apple closed at exactly $500.00, when we all knew a close under $500 would be very bad for Apple, and thus the market, because in the last few years, Apple was the market. But that all changed. If you were watching closely, you might have wondered why AAPL plummeted five bucks in the last moments of trading last Friday. Apparently, some whale dumped 800,000 shares of Apple one second before the market close, and bought an equal dollar amount of ES-Minis (S&P 500 futures). WTF is that? Do you really trust a market where such things are commonplace? Anybody who owns any individual stock these days is taking a helluva risk. (More disclosure: I own some individual stocks.)

The Counter Arguments

There are two main counter-arguments for buying Apple:

One is the fact that it enjoys a low price-to-earnings (P/E) ratio, right around 10. Long story short, P/E ratios are rarely a good indicator of when to buy a stock, and do not necessarily a bargain make. A great time to have bought Apple was in October 2004, when its P/E ratio was 45.

The other is that Apple has tons of cash in the bank ($137 billion or so – enough to buy every man, woman and child on the planet the new Nicki Minaj CD and a slice of pizza).
Unfortunately, Apple doesn’t have ready access to most of that money, as the company is stashing it overseas for tax reasons.

Sure, Apple may have one last ace up its sleeve with the Apple TV, but we won’t know until it finally gets here – if it ever does.

To sum up: Apple’s leadership and vision are suspect; its market share, margins and cool factor are shrinking; the stock is technically broken; and it’s over-owned and over-loved. To me, it looks bad, really bad. But I’m so bearish that I’m probably wrong already. Apple could still get to $1,000 next year. But whether it hits new heights or falls to $200, or both, expect plenty of volatility, dead cat bounces and flash crashes along the way.

With that in mind, maybe the one thing I should have told my nephew before he bought his five shares Wednesday morning is this: “Nobody knows anything.” On the other hand, as it was his first stock trade, this was probably a wonderful learning opportunity for him. He now knows everything he needs to know about the stock market.

Finally, please remember that everything I’ve said is provided for education and informational purposes only, without any express or implied warranty of any kind, including warranties of accuracy, completeness or fitness for any particular purpose. These points are not intended to be and do not constitute financial advice, investment advice, trading advice or any other advice. You should not make any decision, financial, investment, trading or otherwise, based on any of the information presented here without undertaking independent due diligence and consultation with a professional broker or competent financial advisor.

Lead image courtesy of bloomua / Shutterstock. Other images courtesy of Shutterstock and ReadWrite.

Reason 4: Trees Don’t Grow All The Way To The Sky

Apple has enjoyed a lovely run, from $6.56 a share on April 17, 2003 to slightly more than $702.10 last September 19, an increase of – wait for it – more than 10,600%.
Do you think the stock is going to repeat that growth over the next 12 years? Are you crazy? Did you pass middle-school math? The company’s revenue growth has also been astounding: Apple grossed $156 billion in 2012, vs. $6.2 billion in 2003, or “only” an increase of about 25 times. If it were to repeat that performance, in 10 years Apple’s revenue would be close to the GDP of a major European nation, like the UK, France or Italy. Not likely.

Reason 5: Never Catch A Falling Knife

Apple was a “generational buy” last November 16th at $522 per share, according to CNBC’s Joe Terranova. I’m not sure how long a generation is – 20 years? But Apple was in the $300s just 15 months ago. Is 15 months a generation? If it was a generational buy at $522, what is it at $450? So maybe a generation is now 10 weeks?

Reason 6: There’s Nothing Sadder Than A Momentum Stock That’s Lost Momentum

Remember our old friends Krispy Kreme, SodaStream, Crocs, Solarfun…? Better yet, let’s try to forget them. I’m not comparing Apple the company to those companies, but Apple the stock is all too similar.

Reason 7: Apple Is Over-Owned And Over-Loved

One out of every four mutual funds owns Apple, and many have limits on how much of one individual stock they can own. Apple makes up 20% of the Nasdaq 100 (QQQ). Who’s left to buy it? Remember what happened to Sun, Microsoft, Cisco and Qualcomm – other brilliant, innovative, widely owned and loved stocks – when the chickens came home to roost? To put things in perspective, Apple’s staggering loss of $70 billion in market capitalization in one day last week was only the third-largest one-day drop in history; Microsoft holds first and second place, from back in the spring of 2000. It was also a can’t-miss company that grew like crazy year after year – until it didn’t.
And Microsoft stock is still only about a quarter of what it was at its peak, 12 years later.

Reason 8: Technical Analysis

Apple’s been carving out a textbook head-and-shoulders chart pattern for a few months now, with left and right shoulders at around $590 per share. This is a very common pattern, so common it holds true only about 50% of the time. But as Apple has become a plaything of hedge funds and algos (over 70% of trading on Wall Street is computer-based these days), it has worked perfectly with various technical analysis methods. Turns out Apple did start to fulfill the head-and-shoulders prediction last week, when it broke the neckline at $500, and according to some, is now perhaps headed to $340. Perhaps not coincidentally, $340 is also near a classic Fibonacci retracement of the March 2009 to September 2012 climb from $80 to $700.

Some other technical analyses to keep in mind: Apple made a “death cross” last December 10, meaning the 50-day moving average dipped below the 200-day moving average. The stock is now trading well below both its 50-day and 200-day moving averages. As Apple started its descent in September, volume picked up on selling days and was light on the uptick days; those mini-rallies were probably due to short covering. This is not a pretty picture.

Finally, note that the stock went vertical at $425, and is destined to return there, according to Jeffrey Gundlach, perhaps the world’s most successful bond trader. And if you look at the chart, there are some gaps around $375, and the initial take-off was on very light volume. Hard to imagine a worse technical setup.

Reason 9: The Market Was Going Up While Apple Was Going Down

Most damning of all, perhaps, is the fact that the market went up the last two days of last week. Apple couldn’t even manage a dead-cat bounce back up to $500 on short-sellers covering their positions, even on a fairly positive day with good economic news. Imagine how bad this would be if the market in general were tanking.
In general, it’s a good idea to stay away from stocks that show red while everybody else is green.

My nephew called me the other day, the morning before Apple released its fourth-quarter earnings last week, and asked me if he should buy Apple stock. His “friend,” a stockbroker, said he couldn’t think of “one good reason why not to buy the stock.” I didn’t get a chance to speak with him before he pulled the trigger; my nephew got in at $510 per share or thereabouts. But if I had, I would have shared these 10 good reasons not to buy Apple stock – and they’re just as true after that post-earnings collapse.

(Disclosure: I’m a long-time Apple fanboy. I currently have four MacBooks, an iMac, an iPad, 2 iPhones and I don’t know how many iPods in my house. I still think the Apple IIci is the greatest computer ever made.)

Reason 1: As You May Have Heard, Steve Jobs Passed Away

Although most often kindly thought of by the general populace as a brilliant innovator and visionary, Jobs was also “kind of a dick” – which is, of course, the best kind of guy to run a gigantic multinational corporation. Tim Cook was a great COO, but may not be sociopathic enough to be a great CEO – and he doesn’t emit a reality distortion field. Apple leadership also took a big hit when Cook fired Scott Forstall, Apple’s most prolific inventor. Which brings us to…

Reason 2: Apple Is No Longer Innovating

Remember how crazy people went over the iPhone when it first came out? Over the iPad in 2010? Those products were “beautiful” and “revolutionary” and cool as hell. Now we’ve got the iPad Mini and the iPhone 5 – and the reaction is… not so much. As far as status symbols go, the iPhone is now sold at a discount at Walmart.

Reason 3: Apple Now Has Real Smartphone Competition

You might disagree, and will let go of your iPhone only when somebody pries it from your cold, dead fingers.
But lots of people love those Samsung phones with the big-ass screens, not to mention the Google Nexus 4 – if the Nexus supply weren’t constrained, Apple might have sold even fewer iPhones, and iPhone sales account for an incredible two-thirds of the company’s profit.
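For readers curious how the “death cross” signal mentioned under Reason 8 works mechanically, here is a minimal sketch. The helper names are invented and the prices in the usage note are made up; this only illustrates the moving-average comparison, not an actual trading tool.

```python
# Minimal sketch of a "death cross" check: the 50-day moving average
# dipping below the 200-day moving average. Illustrative only; not
# trading advice and not anyone's actual signal implementation.

def moving_average(prices, window):
    """Simple average of the most recent `window` prices."""
    return sum(prices[-window:]) / window

def death_cross(prices):
    """True when the 50-day average sits below the 200-day average."""
    if len(prices) < 200:
        return False  # not enough history to compute the 200-day average
    return moving_average(prices, 50) < moving_average(prices, 200)
```

A series that traded flat at a high level and then slid for its last 50 sessions will trigger the check, which is exactly the shape Apple’s late-2012 chart had.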
We’ve all been there. You are at a crowded event, taking pictures with your smartphone and trying to share them on Facebook or Twitter, but the damn photo just won’t upload. The data network you are using is clogged because too many people are trying to do the exact same thing you are: making phone calls, sending texts, uploading pictures and streaming video. With everybody trying to do the same thing at the same time, nobody can actually do anything.

The Big Arena Problem

Like other mobile carriers, AT&T has been working on solving this “big arena” problem for a while. It has spent several years improving its DAS (Distributed Antenna System) and placing it in venues across the country, including Lucas Oil Stadium in Indianapolis and the TD Garden in Boston. For the Super Bowl at the Mercedes-Benz Superdome Sunday night, AT&T had its DAS up and running, with 11 temporary COWs (Cells On Wheels) to help the data flow. So, when the lights went down on the Super Bowl, fans were still theoretically able to make phone calls and tweet to their hearts’ content.

Blackout Boosted Data Usage

According to AT&T, the busiest data traffic came during the halftime show when Beyonce was performing – and during the blackout itself. AT&T users consumed 78GB of data on the in-stadium network during that hour, more than twice what they did during the busiest hour of last year’s Super Bowl in Indianapolis. During the 34-minute power outage, AT&T users sent twice as many texts, consumed 10GB of data and made more phone calls than they did at any other hour during the game. This makes perfect sense considering that the users were in the middle of a live news event and had nothing to do but putz around on their phones while they waited for the stadium lights to come back on.
Total data usage for the entire event was 388GB on the in-stadium network. That is a lot of photos, even for the 71,024 officially in attendance at the game (many of whom were likely not AT&T users). That was an 80% increase in traffic from the Super Bowl in Indianapolis. In addition, AT&T users made more than 73,000 calls during the Super Bowl. Even if everybody in the stadium had been an AT&T user, that is more than one call per person; in all likelihood, it was more like three or four calls per AT&T user.

Of course, many of these numbers are artificially high because of the extended delay in the middle of the game. Judging data from the Super Bowl or any other singular event can lead to suspect conclusions. But if there is one thing to take away from the Super Bowl in New Orleans in relation to AT&T’s cellular traffic, it is that more and more people have smartphones, and they are using them more and more. This was the dominant theme of mobile in 2012, and it continues into 2013 and beyond.

Top image: Mercedes-Benz Superdome, New Orleans, Louisiana, courtesy Wikipedia.
Reports of Twitter’s death have been greatly exaggerated. Yes, the social network is experiencing slow growth. Sure, Twitter is implementing a slew of changes that will transform it into a website very different from the text-based social network we’ve come to love. And maybe you’ve gotten bored with it. That doesn’t mean we’ll be attending its funeral any time soon.

Early adopters will no doubt decry Twitter’s evolution – and I’m one of them. I’m not a fan of the new Twitter that copies features from Facebook with abandon, and I’m definitely not alone. People who have used the service for years have become accustomed to the way it looks and operates; we’ve become the Twitter elite that gets how Twitter works, with all the silly hashtags and Twitter canoes, and we don’t want more people coming in to rock the boat.

See Also: Why Twitter’s Facebook Obsession Is Unhealthy

The thing is, Twitter can’t be considered a dead social network until it has time to live among the masses. And to appeal to a larger audience – one that isn’t just tech bloggers, media, early adopters and their ilk – it needs to change. Twitter, as we know it, might be dying. But much like a caterpillar turning into a butterfly, Twitter needs to experience radical change before it can really fly.

Not A Town Crier, But A Friendly Companion

Twitter CEO Dick Costolo has historically referred to his social network as a “town square,” with millions of people sharing news and events with each other in 140-character spurts in real time. But Costolo dropped his metaphor during Twitter’s first-quarter earnings call on Tuesday. “We think of Twitter as this companion experience to what’s happening in the world,” he said. Twitter itself is acknowledging the changes.
It’s come to realize the “town square” metaphor doesn’t resonate with the masses, and it needs to reposition itself as an accompaniment to, rather than an authority on, what’s happening around its users. Twitter as a companion service means that people don’t necessarily have to tweet or contribute all the time just to enjoy the greater community that exists solely on Twitter.

The company’s move to become the most popular “second screen” experience is a perfect example. Twitter wants to be the application everyone is using while watching television, but that doesn’t necessarily mean people must tweet simultaneously. Sometimes just following their favorite celebrities’ statuses or reading hashtag threads will be enough.

For instance, on Monday’s “The Voice,” banter between coaches Blake Shelton and Adam Levine found its way to Twitter. Shelton tweeted rival coach Levine’s cell phone number, which was retweeted almost 40,000 times. As a fan of “The Voice,” watching the duo tease each other without being privy to it firsthand might produce a bit of FOMO (fear of missing out) and could prompt new Twitter users to sign up just to take part in the fun.

Twitter also announced Tuesday it has grown to 255 million monthly active users, up from 241 million last quarter. Still, investors don’t feel Twitter is growing fast enough: Those growth numbers fell below analyst expectations, and as a result, Twitter shares fell shortly after the company released its earnings.

An Expected Shift

Indeed, Twitter has a slow-growth problem, but it’s not for lack of awareness. Twitter is unavoidable: Tweets are embedded on news outlets around the world, broadcasters read tweets while calling sporting events, and it’s almost impossible to watch live television without seeing an advertisement incorporate a hashtag or an @-mention.
People are aware of Twitter; they just don’t know how, or why, they should use it. The company has made significant changes to its core product in an effort to attract a broader audience and boost user growth. Most notably, the company completely redesigned user profiles by ripping off a more user-friendly service: Facebook. The Facebookification of Twitter certainly has its downsides; we don’t want another place for friends. But as its slow growth demonstrates, Twitter, as it is right now, isn’t enough. Twitter also hinted at more tweaks to its direct message product, a feature that has seen its own share of updates in recent months. A more robust messaging service that complements its companion app strategy will hopefully encourage even more people to use the application.

Try as it might to convince users otherwise, Twitter still faces an identity problem. It’s struggling to become a must-have application for everyone, while those of us who rely on it for news and events are slowly becoming dissatisfied with the way it seems to be diluting itself to appeal to a broader audience. Twitter is taking a risk: it’s making changes to get more people on the service that alienate the people who helped build it up in the first place. It’s a risk Twitter is willing to take, because getting the next 255 million people on Twitter is worth making a few dedicated users very unhappy.

Lead image courtesy of NYSE
If you envision the Chicago of the future, complete with sprawling skyscrapers, delivery drones, and automated cars cruising down brightly lit streets, you probably aren’t imagining the same city as some of Chicago’s government officials. Under recently proposed legislation, the city of Chicago may ban automated vehicles entirely.

Aldermen Ed Burke and Anthony Beale proposed the ordinance in a city council meeting this past Friday. According to a report from the Chicago Tribune, the penalty for operating a driverless car without a human behind the wheel would be a $500 fine. The ordinance would apply to every street within Chicago’s city limits.

“We do not want the streets of Chicago to be used as an experiment that will no doubt come with its share of risks, especially for pedestrians,” Burke said in a press release from the city’s Committee on Finance. “No technology is one-hundred percent safe.”

See also: Tesla upgrades autopilot — but you still need to pay attention

This statement is in stark contrast to the data currently available on autonomous and self-driving vehicles, which indicates that they are, even in their infancy, as safe as or safer than human drivers. Google, one of the pioneers in self-driving technology, has operated autonomous vehicles in multiple states, covering over 1.9 million miles since 2009. During that time, its vehicles have been found at fault in only a single accident, in which a car side-swiped a bus to avoid debris blocking its path.

Google’s self-driving cars have been involved in other accidents, including one in which a vehicle was recently hit by a drunk driver in Chandler, AZ. In more than a dozen such cases, human error, whether on the part of a human operator in the Google vehicle or of another driver, was found to be at fault.

There have also been reports of accidents involving Tesla’s Autopilot system, an in-beta self-driving system available in many Tesla vehicles.
This system is not currently advertised or intended to serve as an autonomous technology. Instead, it depends on a human driver being present, mindful, and ready to take the wheel at any moment. One fatality involving a Tesla was reported this year, in which a brightly lit sky masked the white side of a semi trailer that was blocking the road. Neither the driver nor the car was able to detect its presence in time to stop.

Chicago would be among the first cities to ban

If this legislation passes, Chicago will be among the first cities in the United States to ban autonomous vehicles outright. As more cities take this stand, a future where you could send your car across the country, or even city to city, becomes increasingly difficult to imagine.

The news isn’t all bad for autonomous car fans. Pittsburgh has become the testing ground for Uber’s autonomous vehicle project, which Uber hopes will one day enable autonomous vehicles to drive to, pick up, and drop off passengers without the need for a human driver behind the wheel. There is certainly a place in our visions of the future for an autonomous roadway filled with fast-moving cars without steering wheels, but we’re not going to get there without convincing a few naysayers along the way.

Tags:#autonomous cars#Chicago#Internet of Things#IoT#Self-Driving Ryan Matthew Pierson
Donal Power Tags:#IoT#Lux#Singapore#Smart Cities#smart lighting

A telltale clue that you’re in a smart city is prominent LED lighting infrastructure, and five of the smartest global cities are absolutely festooned with futuristic lampposts. Lux Review highlighted which cities are the global leaders when it comes to integrating advanced lighting and other smart technologies into the urban environment.

Lux crowned the Spanish metropolis of Barcelona as its top smart city, largely because of its lighting. Connected technology fitted to lampposts allows city computers to measure traffic, crowds and pollution. And following a severe drought a few years back, Barcelona’s new streetlights were also rigged to serve as weather monitors. The streetlights’ onboard sensors track irrigation levels and rainfall, helping the city run municipal sprinkler systems more efficiently.

Next up is San Francisco, which Lux noted has been a leading smart city for years, partially due to its proximity to Silicon Valley. The city recently announced that it would replace 18,500 low-pressure sodium street light fixtures with smart LEDs.
Wireless smart controllers will allow the city to use the new LED fixtures to remotely monitor light performance and warn the city when each light burns out, which will increase safety and save money. And these lights will be the greenest street lighting in the state, as they will be powered with 100% clean energy.

Copenhagen is another smart city that is leading by lighting, and it aims to become carbon neutral by 2025. Nearly half of the Danish city’s older street lights were recently replaced with LED versions that brighten when cars draw near and dim after they pass. These fixtures also contain sensors that capture data to improve city services like municipal waste collection.

Next is Singapore, which has launched an ambitious “smart nation” strategy that has seen the Asian city-state wired up with huge numbers of wireless sensors that track everything from garbage cans to traffic. In the area of lighting, Singapore has partnered with Philips Lighting to develop a connected streetlight management system. And the government has also teamed up with Scottish visible light communication firm pureLiFi to bring this revolutionary technology to Singapore.

LA has seen big savings already

Lastly, Lux cited Los Angeles as a smart city light leader because of its plan to convert more than 200,000 old-school streetlights into smart LED fixtures. With the project only 80% complete, LA has already saved $9 million and seen reductions in crime thanks to better lighting. The new smart poles have wireless capability and improve phone reception. They can also monitor for lighting outages and parking availability and listen for car crashes, reporting this information back to the city.
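The brighten-on-approach behavior of Copenhagen's fixtures amounts to a simple control rule, which can be sketched in a few lines. Everything below is invented for illustration (the thresholds, the brightness levels, and the clean distance reading that a real motion sensor would not provide); it is not any vendor's actual controller logic.

```python
def target_brightness(vehicle_distance_m, full_level=100, idle_level=30,
                      near_m=25.0, far_m=100.0):
    """Choose an LED dimming level from the distance to the nearest vehicle:
    full brightness when a car is close, a low idle level when the road is
    empty, and a linear ramp in between. All numbers are illustrative."""
    if vehicle_distance_m is None or vehicle_distance_m >= far_m:
        return idle_level          # no vehicle nearby: dim to save energy
    if vehicle_distance_m <= near_m:
        return full_level          # vehicle close: full brightness
    fraction_near = (far_m - vehicle_distance_m) / (far_m - near_m)
    return round(idle_level + fraction_near * (full_level - idle_level))
```

A real deployment would also smooth transitions over time so that fixtures do not flicker as sensor readings fluctuate.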
On first hearing the term “image analysis,” it’s easy to think immediately of software that can identify objects in photographs. But while static image analysis is a vital application of computer vision, and one that offers immense value, it’s only part of a bigger picture. For marketers in particular, some of the most compelling use cases center on analyzing the living, breathing consumer in order to improve marketing performance (and its measurement), create stronger personal connections between customers and brands, and reduce costs. This article explores some marketing use cases for real-time visual analysis of people, along with others that involve analyzing photographic images. Each of these use cases offers direct benefits to brand owners and creative agencies by way of improved performance measurement, reduced costs, and more personalized, engaging experiences for consumers.

Making Marketing More Personal

Computer vision and image analysis technologies may have a way to go before they come close to human levels of visual perception, but a fair degree of accuracy is already possible in certain scenarios, such as visual analysis of gender and age groups. That’s a level of precision sufficient to make new inroads into marketing personalization. The McDonald’s burger empire is just one enterprise investigating this approach. The fast-food giant plans to open self-service kiosks equipped with cameras and image analysis software. The kiosks will use computer vision to identify customers’ gender and approximate age and then recommend menu items based on this visual data.

Personalized Window Shopping

Aside from McDonald’s, other retail companies are working with computer vision consultants and developers to increase personalization through image analysis.
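A sketch of what a kiosk's recommendation step might look like, assuming an upstream vision model (not shown here) has already produced a coarse age-band estimate from the camera feed. The segment names and menu items are invented for illustration; this is not McDonald's actual logic.

```python
# Hypothetical age-band-to-menu mapping; a vision model (not shown)
# would supply the age-band estimate from the kiosk's camera.
MENU_BY_AGE_BAND = {
    "child":  ["kids meal", "apple slices"],
    "teen":   ["double burger", "large fries"],
    "adult":  ["grilled chicken sandwich", "side salad"],
    "senior": ["coffee", "breakfast sandwich"],
}

DEFAULT_MENU = ["classic burger", "fries"]

def recommend(age_band):
    """Map an estimated age band to menu suggestions, falling back to a
    default list when the estimate is missing or unrecognized."""
    return MENU_BY_AGE_BAND.get(age_band, DEFAULT_MENU)
```

The fallback matters in practice: visual estimates are uncertain, so a low-confidence or out-of-range estimate should degrade gracefully to a generic menu rather than a wrong guess.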
Camera-equipped screens in storefronts will soon be capturing and analyzing data about passers-by, or perhaps more specifically, window shoppers. When a window shopper stops to check out the screen, she will see the content change to show the items or products that the software perceives as relevant to her, based on the analyzed visual data. In an apparel store, for example, a window shopper may see images of garments that complement what she is wearing. Could there be any better way of grabbing window shoppers’ attention and turning them into impulse buyers?

Understanding Consumer Attention and Sentiment

Brand owners will soon be able to use image analysis to capture improved metrics about how advertising is perceived by viewers. Again, while the technology may not yet be sensitive enough to recognize all the subtleties of facial expressions and body language, computer vision can certainly discern between a smile, a blank expression and a frown. This type of analysis can be a game-changer for enterprises wishing to assess TV and video advertising effectiveness. The conventional way to gather such information is via surveys, in which ad viewers express their sentiments about the content they see. The weakness of this approach, though, is that humans are notoriously unreliable when it comes to articulating their impressions.

Smiles Don’t Lie

The fallibility of surveys was evidenced when one particular advertisement, the subject of a 2016 study by Omnicom Media Group, was ranked 55th out of 63 advertisements on the 2016 USA Today Ad Meter list. That result was based on viewers’ stated opinions. However, the ad was then shown to a sample of viewers, whose emotional responses were measured using facial image analysis.
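The measurement step in a study like this can be reduced to a simple aggregate. A minimal sketch, assuming a facial-analysis model (not shown) labels each sampled viewer reaction as "smile", "neutral", or "frown":

```python
from collections import Counter

def sentiment_score(reaction_labels):
    """Collapse per-viewer expression labels into one score in [-1, 1]:
    +1 if every sampled reaction was a smile, -1 if every one was a frown,
    and 0 for an even mix (or for no data at all)."""
    counts = Counter(reaction_labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return (counts["smile"] - counts["frown"]) / total
```

A production pipeline would add confidence thresholds and per-scene timestamps, but even this crude score gives a sentiment signal that does not depend on viewers articulating their impressions.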
The findings belied the USA Today ranking, as the ad raised more smiles among viewers than did other productions ranked much higher on the Ad Meter list. The takeaway from this study is that capturing viewers’ spontaneous emotional reactions can provide more accurate data than self-reporting, at least when it comes to gauging the impact of audio-visual advertising. While it may not yet be clear how and where to utilize image analysis for this purpose, the possibilities are being taken seriously, and it’s likely that the technology will soon become an important source of marketing performance data.

Accurate Analysis for Outdoor Advertising

Of all traditional methods of advertising, outdoor ads represent the only format still enjoying growth in spending. In 2017, out-of-home advertising sales rose by 2.7%, according to data from the Outdoor Advertising Association of America. The gradual transformation of outdoor signage from static to dynamic digital displays is no doubt helping to maintain public interest in this advertising format. The only problem with signs and billboards, though, is the absence of clicks that can be counted to measure advertising performance.

All-Seeing Signage

Indeed, online marketing offers an infinitely greater degree of performance feedback than outdoor advertising, but image analysis could be the breakthrough that levels this particular playing field.
Small, inexpensive cameras mounted on or close to outdoor signage, integrated with image analysis applications, could become the source of feedback on a range of performance factors, including:

- How many people walked past the signage
- How many people stopped to look at the advertisement
- How long people spent looking at the advertisement
- Demographics (such as age and gender) of people who stopped to look

According to a Marketing Land article, this outdoor equivalent of tracking click-through rates has already been deployed in tests by creative agencies such as M&C Saatchi, who believe it can support advertisers in making outdoor signage even more dynamic, perhaps enabling real-time changes in the displayed content based on audience response.

No Time-Consuming Tagging

Another benefit marketing teams might receive from image analysis is a reduction in the workload associated with displaying products on ecommerce websites. It currently takes plenty of effort to publish product images, since every image must be made identifiable by means of tags. For many ecommerce enterprises, image tagging can add up to a substantial amount of time and therefore labor cost. It can also present semantic challenges for consumers when they search for products. By enabling visual search tools based on deep learning technology, digital marketers will no longer need to manually assign text-based tags to product images, and consumers will no longer need to struggle to find the right words for their searches.

Seeing Beyond Semantics

Since image analysis programs can identify not only the contents of an image but also its context, visual searches powered by image analysis will allow online shoppers to search for products by selecting or uploading images similar to the items they wish to find.
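The retrieval step behind such a visual search can be sketched as nearest-neighbor ranking over embedding vectors. In a real system the vectors would come from a deep model applied to the query and catalog images; the tiny two-dimensional vectors in the test below are stand-ins for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def visual_search(query_vec, catalog, top_k=3):
    """Return the names of the top_k catalog items whose embeddings are
    most similar to the query image's embedding."""
    ranked = sorted(catalog.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]
```

At catalog scale, the brute-force sort here would be replaced with an approximate nearest-neighbor index, but the ranking principle is the same: no text tags are consulted at any point.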
This is especially useful for products with subjective descriptions, which might be labeled with any of a range of tags depending on the jargon used by the merchant. Ultimately, then, this particular use case of image analysis offers the double benefit of making searches easier for shoppers to conduct and reducing the costs involved in creating and maintaining online product catalogs.

Image Analysis: A New Dimension in Data-Driven Marketing

The image analysis use cases described in this article are exciting. They may also be but a few of the more obvious possibilities awaiting marketers who step into this new dimension of physical/digital interaction. In the same way that technologies like augmented reality, sensors, and beacons have blurred the lines between ecommerce and traditional retail, cameras and image analysis may now play a similar role in merging digital and traditional forms of advertising media (which, let’s face it, are largely visual) to create a new landscape for data-driven marketing. Time will tell if its influence will be that revolutionary, but it does seem likely that image analysis will reshape the way brands strengthen their identities, heighten consumer awareness, and attract new followers, all while measuring performance more accurately and reaching new levels of cost efficiency.

Tags:#Analysis Milos Mudric

Milos Mudric is a content specialist and tech enthusiast. He is the founder of Silver Fox Digital and SEObrainiac.com and he occasionally writes interesting stories about Blockchain, IoT and Fintech.
This post was written by Kacy Mixon, M.S., LMFT, Social Media Specialist. She is a member of the MFLN Family Development (FD) team, which aims to support the development of professionals working with military families. Find out more about the Military Families Learning Network FD concentration on our website, on Facebook, on Twitter, on YouTube, and on LinkedIn.

By Kacy Mixon, M.S., LMFT

[Flickr, klanggabe by Leo Grubler, CC BY-ND 2.0] Retrieved on September 17, 2015

About Family Development: The Family Development concentration provides opportunities for military service professionals to engage in an online community of resources and information that supports work related to family violence. Our purpose is to provide the latest research, continuing education materials, and training opportunities related to the prevention, intervention and treatment of family violence in military families. Our target audience includes professionals working within the Family Advocacy Program and other mental health professionals who work with military families.

Mixon, K. (2013). Kacy Mixon gives eXtension.org permission to use her personal photo.

Here you can find resources related to the important work that military service professionals engage in. You will also find opportunities to LEARN about professional development, ASK questions related to your work with families, and SHARE your ideas.
Jay Morse & Heidi Radunovich, PhD

Using smartphones and other mobile communication devices has become a way of life for many of us. As of 2010, there were over 7,000 health applications for mobile devices, and the list keeps growing. Dr. Shore and colleagues have cataloged and prioritized mental health applications for the military and summarized three leading military mental health projects using mobile technologies.

[Flickr, Operation Lone Star by Texas Military Forces, CC BY-ND 2.0] Retrieved on September 17, 2015

Mobile Health, or mHealth (the use of mobile communication devices for health care services), can improve traditional mental health practices by enhancing communication, enriching health information, encouraging engagement, and improving compliance. Mobile technology can be used easily on base or in the civilian community, is easily accessible (it can be carried in a pocket, purse, or backpack), and can provide patient physiological data as well as voice and text communications. A wide range of mHealth applications for the military have been developed or are in development. Some of the projects under way include:

- Remote Exercises for Learning Anger and Excitation Management (RELAX): This application collects self-reported information about the user’s emotions, along with physiological data, which is reported to a therapist to support therapist-directed feedback addressing anger and stress.
- Remote PTSD monitoring and diagnosis using an automated system: This application uses voice analysis software to screen and identify individuals at risk for PTSD.
- A Conversational Independent Living Assistant for Cognitive Impairments: This project extends the current Planning and Execution Assistant Trainer (PEAT) to help users in the VA system plan, execute, and monitor daily activities.
The application is also planned to include a virtual caregiver who interacts with the user.
- Naturalistic Neurocognitive Assessment: A video game for smartphones, this application assesses increasingly complex neurocognitive metrics.

While there are many opportunities to develop innovative mobile technology solutions, there is a limited base of mental health literature evaluating outcomes when these devices are used. Still in its infancy, the field of mobile technology and mHealth is fast-moving and offers many possibilities for future use in mental health care.

References

Shore, J. H., Aldag, M., McVeigh, F. L., Hoover, R. L., Ciulla, R., & Fisher, A. (2014). Review of mobile health technology for military mental health. Military Medicine, 179(8), 865-878. doi:10.7205/MILMED-D-13-00429

This post was written by Jay Morse & Heidi Radunovich, PhD, members of the MFLN Family Development (FD) team, which aims to support the development of professionals working with military families. Find out more about the Military Families Learning Network FD concentration on our website, on Facebook, on Twitter, on YouTube, and on LinkedIn.