- International Games Innovation Conference 2011 at Chapman University, the City of Orange, California, Executive Chair – Dr. Narisa N. Y. Chu. This is the 3rd IEEE Conference Sponsored by the Consumer Electronics Society, funded by Rambus Incorporated.
- IGIC 2011 Honored Trip Hawkins for his Innovations in digital games, November 2, 2011
- IGIC 2011 Conference Report
- Men versus Machines’ Intelligence in the world of Question-Answer Games! - May 21, 2011.
- Toward Blood Image Filtering against Internet Spam - February 17, 2011
- Energy Consumption Metrics for Plug-in Consumer Electronics on the Smart Grid – May 5, 2010
- Smart Broadband Service Architecture – May 2, 2010
- Critical System Resolution for High Definition Satellite Set-Top Box Deployment (excerpt) – November 15, 2008
Men versus Machines’ Intelligence in the world of Question-Answer Games! - April 28, 2011.
IBM’s Watson entered the Jeopardy Competition with Ken Jennings and Brad Rutter on American Public Television – Inspirations for the IEEE CE Society 3rd IGIC 2011
IBM devoted a team of 25 computer scientists and engineers of the sharpest caliber for four years to building a bionic Jeopardy player, Watson, with 2,880 processing cores. Watson beat the best Jeopardy players in history, Ken Jennings and Brad Rutter, on the American CBS Jeopardy show broadcast from February 14 to 16, 2011, with a score of $77,147, more than three times Jennings' $24,000 and Rutter's $21,600. Based on massive statistical analysis that attempted to overcome humans' perceptual, conceptual and symbolic ingenuity, Watson demonstrated natural language capability and a skillful betting strategy and, free of profanity, acted like an impressive, knowledgeable gentleman on stage. The public did not unanimously endorse Watson, pointing to its shortfalls in conceptual thinking and creativity, its obvious deafness, and its lack of emotion, impervious to nerves and psychological maneuvering. Yet fears of Watson's intelligence and its impact on future jobs prevailed. It is easy to overlook the upside, even if the outcome of Watson is to free us from mundane tasks, to do more of what we love (singing, surfing, admiring, ...), and to be equipped beyond an individual's intellectual limitations in creating miracles.
This historic Jeopardy game show of man versus machine appeared incredibly timely, lending some thoughts to the upcoming International Games Innovation Conference, setting foot in Southern California on November 2-4, 2011. While 14 million viewers were glued to the Jeopardy show for those three days in February, immediate Internet blog comments fell into 4 major categories:
- The precise buzzer speed – Watson had a distinct advantage playing under the premature-buzzing penalty rule (1/4 second) and was first to the buzzer on many questions, grabbing the chance to answer.
- The smart game strategy – hustling for Daily Doubles and placing unusual wagers.
- Playing the fuzzy, best-guess scenario, sometimes answering correctly at confidence levels as low as 30%.
- The laughable mistake – "Toronto" picked as a U.S. city – perhaps the most touted consolation for human jealousy over the bully Watson's victory. But this mistake cost Watson a mere $947. (What a smart dude!)
Ken Jennings' written comment on the final Jeopardy clue drew high marks for his sportsmanship: "I for one welcome our new computer overlords." In contrast, Watson showed no humor and no sportsmanship while comfortably on a winning streak, beating the best human Jeopardy players by a factor of 3. (A factor of 1 would have been enough, in the author's mind.)
Some imminent questions:
- Cosmetics and esthetics: How was Watson’s physical appearance received by the audience, particularly when it gained the ultimate winner’s position?
- Could Watson have been less intimidating in defeating its human opponents if it had been presented with a heart and expressed emotion?
- How many human brains would Watson be equivalent to?
- Was this competition with Watson fair to Ken Jennings and Brad Rutter? What was the entertainment value?
- Could Watson’s memory and buzzer speed be throttled to foster a fairer game in the public eye?
- How should Watson's grabbing of the signaling device be judged, given its obvious advantage over Brad Rutter's legendary quick thumb?
- What would be other applications that could utilize Watson’s instant buzzer capability?
- Search algorithms: fair to compare to Google, or not? Watson, in essence, had an Internet within itself, in contrast to a typical Google "search" in terms of depth and breadth. How would this competitive "search" mechanism compare to others, including Google's? How many Google-equivalent "searches" would be rendered per Watson "search", and vice versa? What would be the relative time for Watson to "search" for an answer compared to using Google? (It has been stated that, unlike the most advanced Internet search engines, which can only find results for specific requests, Watson could make connections between words and determine a logical answer from the input data.1)
- More about the user interface – Watson would answer, with its buzzer capturing the majority of opportunities, no matter what the confidence level of its answer; it was shown picking answers with confidence levels in the 30% range. Should voice recognition have been used for Watson instead of electronically typing the questions into the computer?
- Watson's method of calculating wagers and bets seemed deliberately intuitive, with unusual precision. Surprises such as these might be the best entertainment Watson offers.
- Gender? The machine is a male!
Furthermore, the back-room space and the electricity used by Watson seem affordable only to IBM, the US Government, and similarly large conglomerates. After Jeopardy, Watson is destined for hospital and call-center applications, so announced for future good causes and business incentives. Watson has, without doubt, demonstrated a historic milestone in terms of memory and inference capability.
Introducing Watson to medical applications was an idea from the IBM team leader, Dr. David Ferrucci, based on his own trauma involving the sternocleidomastoid. Cases of medical applications are plentiful; for example, who would want Watson more than an Alzheimer's patient with deteriorating memory? There are 15 million Alzheimer's patients in the US, with on average 1 to 3 caretakers per patient, all enduring constant crisis and the prolonged tragedy of death. Reprogramming Watson to cater to an individual's memory and environment should be a relatively trivial task after Jeopardy, if only the physical Watson could be shrunk to a size manageable by consumers at an affordable price.
Could pieces of Watson technology wind up in different scales of consumer electronics, just as mainframes evolved into personal computers, and furthermore reach the failed inner-city school systems? Extensions of Watson beyond Jeopardy could indeed be boundless: plentiful, promising areas of research and development for the consumer electronics industry to explore.
Watson versus the Jeopardy champions: the show is over, but questions linger. What is a game and what is not? What is in us and what is in computers? "At the end, each of us will calibrate our own blends of intelligence and creativity, looking for more help, as the years pass, from ever-smarter computers."1 These and numerous other topics, with views pro and con, are solicited for the 3rd International Games Innovation Conference. Please access http://ice-gic.ieee-cesoc.org.
- Stephen Baker, Final Jeopardy: Man vs. Machine and the Quest to Know Everything, Houghton Mifflin Harcourt, Boston/New York, 2011.
Toward Blood Image Filtering against Internet Spam – February 17, 2011
Abstract— Information integrity is often hampered by Internet spam and uncoordinated requirements arising from the diversity of people, policy and technology. The recent surge in Facebook exchanges has manifested a further need for image-based filtering, for example in detecting signs of violent attacks amid the Middle Eastern and Far Eastern unrest. This is a huge challenge from the technology perspective; our first step is to recognize blood in images, which can reflect signs of violence. A few techniques have been explored and their results discussed, tackling challenges that might help link attacks of unknown sources and anticipate new ground rules and architectures.
Internet spam has been reported to consume between 50% and 80% of e-mail bandwidth. Network facilities and server storage are wasted delivering unwelcome and often vicious messages, some of which threaten national security. E-mail spam has become notoriously rampant even after considerable investment over the last 15 years in the control of viruses, worms, trojans, botnets, zombies, spyware, etc., all under the name of computer malware that traverses the Internet in ever new ways. E-mail spam is not as destructive as the others; most of it comes with goals that are often merely annoying to the recipient. It has been used as a marketing tool by many businesses and is also becoming a propaganda tool with questionable motivations and results. It carries implications that are particularly troublesome in many Asian and Middle Eastern political environments. The purpose of this paper is to build tools that filter politically oriented Internet spam with image recognition techniques, starting with the task of recognizing blood.
Our analysis of Internet security takes into account policy, process and technology. This all-encompassing approach is necessary in light of the latest developments around Google's retreat from China and the bans on BlackBerry by India, Saudi Arabia and other "conservative" countries. We seek technical solutions that have broader applications while also acknowledging political and ethnic requirements.
Categorization of Spam
Spam has become rampant because of its high intensity and ease of penetration. Categorizing spam is necessary to effectively prevent and manage Internet security threats. Because understandings of democracy are contentious and inherently sensitive, one area that has not gained adequate public discussion within the technical community is the treatment of political censorship and its granularity, from democratic to totalitarian regions. Because of the potentially adverse impact on economy and/or stability, cyber detection and resolution of violent spam have been rather sparse. Typically, these issues touch on policy requirements that are felt to be outside the control of security software manufacturers or service suppliers. Misunderstanding exists and often carries condemnation. The purpose of this paper is to contrast various efforts in violence control and explore solutions toward better prevention of this highly undesirable political spam.
A snapshot of spam profiling with regional views highlights political spam, which is not particularly attended to in a democratic country such as the USA.
However, in a totalitarian country such as China, there is a gain to be had from proper detection, rather than blind neglect or over-reaction. In general, the nature of spam, reflecting cheating, distraction and commercial exploitation, is more or less the same in democratic and totalitarian societies, so many existing tools against malware are employed universally throughout the world. However, in many conservative countries, including China and many Muslim countries, politically oriented spam is monitored more intensely, even though its share might be less than 10% (marked in orange in Table 1) and tools for resolution are lacking. In contrast, in the Western world this type of spam is not discussed as much, perhaps under the banner of freedom of speech. In both cases, indications of violence are often left as an afterthought or regretted after the fact.
Spam Management in China
To tackle political spam, no place is better to start than China, given its vast population and Internet prevalence. We therefore review the state of spam management in China. A map of China's e-mail traffic shows inbound and outbound spam intensities.
Three large clusters show spam traffic concentrated in Beijing, Shanghai and Guangzhou (near Hong Kong), all three of which are economic power zones in China.
The Chinese government keeps finding more efficient methods and solutions to keep spam from spreading widely. However, its control of political spam has upset many, both domestically and internationally. In fact, Chinese spam management reflects a rather orchestrated policy, universally acceptable or not. Extensive reporting, public awareness campaigns and technology reinforcement have been promoted impressively by the central government. Yet political spam remains thorny in both detection and treatment.
This paper acknowledges the existence of politically oriented spam and the challenge of resolving it. Common anti-spam techniques are reviewed first, which leads to the identification of violence-related images.
Anti-spam filters have evolved over the years to catch up with ever harder-to-detect spam techniques. Text-oriented techniques assisted by databases and artificial intelligence inference engines fall short with the latest rise of photo-centric and video communications, e.g., Facebook. We examine images in the interest of isolating political violence and select recognition of blood as one of the inroads to recognizing unwanted political spam. This represents a baby step toward meaningful recognition of spam related to any destabilizing force. However, the examination leads to useful techniques that might also help with medical applications. The desire of this endeavor is first to expose the need, and then to broaden the recognition technology and its applications.
Image Filtering of Social Networking Spam Containing Violence
Spam filtering has to go beyond text; with the latest social networking spam, focused image processing becomes necessary. Image processing is memory and processing intensive. We investigate 5 procedures for the recognition of blood. The focus on blood is believed to be linked with many violence scenes, although violence can go beyond blood or contain none at all. The recognition of blood represents only the first step toward violence recognition, and a benefit of blood recognition is that it can have many other applications as well.
The 5 procedures we used in our programs consist of:
(1) color composition,
(2) shape analysis,
(3) association with skin,
(4) overwhelming size, and
(5) other boundary conditions.
Use of all 5 procedures is necessary to avoid false positives, which is further illustrated below.
The above 4 sample images can be recognized with the first 3 procedures incorporated in our program. The image is scanned to detect any regular and irregular shapes associated with typical splashes of blood. Then association with skin is applied, for the foot and the hands, to the two samples that fall short when filtering only by colors and shapes.
Filtered images, produced without color as shown below, serve as an interim step for further examination by our programs.
However, false positives are hard to avoid for the 3 samples shown below. Thus procedures (4) and (5) are necessary for our programs to work.
These false positives are eliminated by their larger-than-usual size, even when their colors fall within the range of palettes created to depict blood.
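As an illustration only, the sketch below combines two of the procedures above, color composition (1) and overwhelming size (4), into a simple screen. The HSV thresholds, area bounds, and function name are assumptions made for this sketch, not values taken from our programs.

```python
# Illustrative sketch only: a crude color-and-size screen for possible blood
# regions, assuming OpenCV/numpy and hand-picked HSV thresholds.
import cv2

def possible_blood_region(image_path, min_frac=0.005, max_frac=0.40):
    """Return True if a plausible blood-colored region is found.

    min_frac / max_frac are assumed bounds: a region that is too small is
    ignored as noise, while one covering most of the frame is rejected as an
    "overwhelming size" false positive (e.g., a red wall or garment).
    """
    img = cv2.imread(image_path)
    if img is None:
        raise IOError("cannot read " + image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    # Procedure (1): color composition. Red hues wrap around 0 degrees in HSV,
    # so two ranges are combined; saturation/value floors drop dull browns.
    low_reds = cv2.inRange(hsv, (0, 120, 60), (10, 255, 255))
    high_reds = cv2.inRange(hsv, (170, 120, 60), (180, 255, 255))
    mask = cv2.bitwise_or(low_reds, high_reds)

    # Procedure (4): overwhelming size. Compare the red area to the frame area.
    frac = cv2.countNonZero(mask) / float(mask.size)
    return min_frac <= frac <= max_frac
```

In the paper's terms, shape analysis, skin association, and the other boundary conditions would then be applied to the surviving regions before a final decision.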
More cases are explored, including:
(1) blood with no association with skin, as shown below on the left, or
(2) blood drops instead of the perceived splashes, marks, or pools of blood collected in our database, as shown below on the right.
These cases present further challenges for the 5-procedure approach we incorporated.
Obviously, 5 heuristic algorithms and the 10 distinctive blood or non-blood samples shown can barely scratch the surface of accurate recognition of violence. Our current recognition rate is around 70% with limited samples, which is not yet satisfactory.
Future work will have to combine additional heuristic approaches, perhaps jointly with analysis of the accompanying text. A restructuring of the recognition program's architecture is also under way in order to handle more samples of blood scenes.
Spam filtering techniques have been identified along with their ever-changing phases. Spam containing violence has received increasing interest from major countries with explicit censorship policies, yet no specific resolution tools are known. Our investigation explores image filtering techniques that identify content containing blood, a possible indicator of violent actions in political and social environments. Improved image filtering techniques leading to the recognition of blood are tried in order to eliminate false positive cases. Results indicate the need for more heuristic procedures toward a higher and broader degree of recognition. Where heuristics do not suffice, building a database containing more blood samples and restructuring the recognition architecture and logic of our current work will become critical. It is also anticipated that this study of blood recognition can have further applications beyond the detection of violent spam.
Energy Consumption Metrics for Plug-in Consumer Electronics on the Smart Grid – May 5, 2010
Abstract— Energy consumption improvement trend analysis for ICs and products, to determine the goals of consumer electronics power reduction requirements based on composite measurements.
Introduction - The 2009 United Nations Climate Change Conference, commonly known as the Copenhagen Summit, pointed out squarely the major responsibility of 3 countries in achieving a cleaner earth through carbon dioxide emission reductions of 25% to 40% below 1990 levels by the year 2020. As illustrated in Figure 1, China and the USA clearly bear the most duty in carbon dioxide emission control.
Worldwide, 2005 had the highest level of GreenHouse Gas (GHG) emissions. Until 2006, the US was the largest emitter of carbon dioxide; China has been the top emitter since 2006. President George W. Bush's goal was to reduce GHG intensity (the amount of GHG emissions per unit of Gross Domestic Product) by 18% from 2002 to 2012; in fact, actual GHG emissions are projected to increase by 11% during that period. In 2006, US GHG emissions decreased 1.5% from 2005 to 7,075.6 teragrams, which is still an increase of 1.4% over the 2000 level of 6,978.4 teragrams. By 2012, GHG emissions are projected to increase to more than 7,709 teragrams of carbon dioxide equivalent, 26% above 1990 levels.
President Barack Obama is offering a U.S. target for reducing GHG emissions in the range of 17% below 2005 levels by 2020. The proposed target agrees with the limit set by climate legislation that has passed the U.S. House of Representatives, but the U.S. Senate is currently considering a bill that cuts GHG emissions further, to 20% below 2005 levels by 2020. The White House noted that the final U.S. emissions target would ultimately fall in line with the climate legislation. In light of the President's goal of an 83% reduction in GHG emissions by 2050, the pending legislation also includes reductions in GHG emissions to 30% below 2005 levels by 2025 and to 42% below 2005 levels by 2030. The US Government now shows a much more serious position than it did at the time of the Kyoto Summit.
The day after the White House announced the U.S. GHG targets, China announced that it will reduce the intensity of its carbon dioxide emissions by 40%-45% by 2020. Carbon dioxide emissions intensity is defined as the amount of carbon dioxide emissions per unit of Gross Domestic Product (GDP).
What is a reasonable level of compliance for the various contributors of carbon dioxide? What is a reasonable effort deemed necessary for developed countries, e.g., the USA, both domestically and internationally? What can research, development, entrepreneurs, consumers, and State and Federal regulations foster and translate from the local to the national level, or vice versa? This paper attempts to answer these questions from the consumer's perspective.
Trends of Reduction in CE Power Consumption
- How to measure power consumption
Three areas of interest are explored in this paper:
(a) Entertainment big ticket items, e.g., TV
(b) Home Electronics: computer and communication devices
(c) IC components
These areas are elaborated below.
(a) TV Power Consumption
HDTV Power Consumption compared the power use of 107 TVs from Jan 3, 2008 to April 19, 2010 [4]. The highest consumption was 575.56 W (Panasonic TH-65VX100U) at 0.23 W/in2 vs. the lowest 71.683 W (Sharp LC-32D47UT) at 0.109 W/in2 – a factor of 8 difference. Cost per year to the consumer ranged from a high of $126.17 (Panasonic TH-65VX100U) to a low of $15.84 (Sharp LC-32D47UT) – also a factor of 8 difference. Calibrated values showed the highest consumption at 489.38 W (Panasonic TH-58PZ750U) and the lowest at 46.909 W (Westinghouse SK-32H640G) – a factor of 10 difference. Calibrated cost per year ranged from a high of $108.07 (Panasonic TH-58PZ750U) to a low of $10.59 (Sharp LC-32D47UT) – also a factor of 10 difference.
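For reference, the annual-cost figures above are consistent with a simple energy-times-rate calculation. The viewing hours and electricity rate in the sketch below are illustrative assumptions, not values stated in the survey.

```python
# Back-of-the-envelope reproduction of the "cost per year" figures, assuming
# roughly 5.5 viewing hours per day and $0.11 per kWh (illustrative values).
HOURS_PER_DAY = 5.5
RATE_USD_PER_KWH = 0.11

def annual_cost(watts):
    kwh_per_year = watts / 1000.0 * HOURS_PER_DAY * 365
    return kwh_per_year * RATE_USD_PER_KWH

for name, watts in [("Panasonic TH-65VX100U", 575.56),
                    ("Sharp LC-32D47UT", 71.683)]:
    print(name, round(annual_cost(watts), 2))
# Roughly $127 vs. $16 per year, matching the factor-of-8 spread quoted above.
```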
Besides the raw power measurements, many user behavior changes can reduce TV power consumption. Examples include:
(1) Turn off the TV and other connected devices when not being used
(2) Turn off the Quick Start option
(3) Turn down the LCD's backlight
(4) Turn on the power saver mode
(5) Reduce light output with other settings
(6) Control room lighting
(7) Buy a smaller screen
(8) Watch TV together
(9) Watch less TV
The control is in the consumer's hands, and significant savings can be achieved simply by exercising item (9) vigorously.
Thus, effective savings should be represented by the sum:
Total reduction in power consumption = reduction [by brand manufacturer] + reduction [by consumer behavior]
50% or more savings can be expected by user behavior change through consumer education.
(b) Home Electronics Power Consumption Improvement
Energy Star standards recommend 20%-30% improvement over legacy systems. Energy Star qualified TVs use 30% less energy than those listed above. By 2008, a wide range of Energy Star qualified TVs became available. Many qualified home electronics, including PCs, cordless phones, cell phones, battery chargers, DVD players and external power adapters, became available with 90% less energy consumption than their previous vintage. A specific example is a Senior Care electronic product that incorporated stringent new power saving mechanisms to last 1 year without a battery charge, compared to a week's operation of a typical cell phone in standby mode.
(c) IC Components Power Consumption Reduction
i. Asynchronous Circuits Power Consumption Improvements
Power reduction is one of the main reasons for designing asynchronous circuits. Asynchronous circuits are inherently capable of having low power consumption. More methods have been identified to achieve even lower power consumption. For example:
- Elimination of input-validity-check
- Elimination of unnecessary handshakes
- Shared bus instead of Input controller
- Integrating back-to-back circuits
- Integrating a state variable by loop
These methods, when applied to sample circuits, have shown power reductions of 20% to 41%, constituting a comprehensive investigation of power consumption reduction. The overall result shows great improvement from the effort to reduce the transistors and transitions necessary in the design of various functions.
ii. Energy Efficient Ethernet (EEE)
Ethernet cable supplies power up to 51 W over a single cable by utilizing all 4 pairs in the Cat.5 cable.
Ethernet 100BASE-TX, 1000BASE-T PHYs and the emerging 10GBASE-T are being worked on under the Energy Efficient Ethernet Draft Standard (IEEE 802.3az) for methods to reduce power consumption. The draft Standard specifies 3 major efforts to reduce power consumption in Ethernet networks:
- Comprehensive control policy
- Additional energy savings: Putting additional resources “to sleep” beyond the PHY by taking advantage of the EEE low power state
- Software utilization in maximizing energy savings in various applications and spaces, allowing for optimized energy savings and a control policy that can be customized.
The IEEE 802.3af-2003 Power over Ethernet (PoE) standard (ratified June, 2003) provides up to 15.4 W of DC power (minimum 44 V DC and 350 mA) to each device. Only 12.95 W is assured to be available at the powered device as some power is dissipated in the cable.
The IEEE 802.3at-2009 PoE standard, sometimes called “POE+”, (ratified September 11, 2009), provides up to 25.5 W of power. Some vendors have announced products that claim to comply with the new 802.3at standard and offer up to 51 W of power over a single cable by utilizing all 4 pairs in the Cat.5 cable. Numerous non-standard schemes had been used prior to PoE standardization to provide power over Ethernet cabling.
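The 802.3af figures above follow directly from the specified minimums; the short calculation below simply restates them.

```python
# PoE power budget per IEEE 802.3af as quoted above: 44 V minimum at 350 mA
# gives 15.4 W sourced, of which 12.95 W is assured at the powered device;
# the difference is dissipated in the cable.
V_MIN, I_MAX = 44.0, 0.350          # volts, amperes
P_SOURCED = V_MIN * I_MAX           # 15.4 W at the power-sourcing equipment
P_ASSURED = 12.95                   # W assured at the powered device
print(P_SOURCED, round(P_SOURCED - P_ASSURED, 2))   # 15.4 W, ~2.45 W cable loss
```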
Broadcom has pointed out that a computer, on average, uses only 1% of the Ethernet link for information transfer. This leads to the possibility of near-zero energy consumption as set forth by this paper. A few other chip suppliers' efforts are described as follows.
iii. Trends of Processor Power Reduction
Intel Core Duo processor balances dual-core computing capabilities with power savings that enable improved battery life in notebooks. Its enhanced voltage efficiency supports cooler and quieter system designs.
Power saving characteristics are described below:
- Intel® Dynamic Power Coordination
With enhanced low power management, the Intel Core Duo processor delivers coordinated dual-core performance “on demand”. Intel® Dynamic Power Coordination allows individual cores to dynamically transition to Halt, Stop Clock, and Deep Sleep power management states, in addition to enabling dual-core coordinated platform Deeper and Enhanced DeeperSleep transitions. The shared Power Management Logic coordinates Enhanced Intel SpeedStep® and idle power management state (C-state) transitions in hardware to manage voltage and frequency more efficiently.
The Intel Core Duo processor can operate at very low voltages and uses advanced techniques to minimize clock and signal switching, resulting in low power dissipation in the active state. Featuring new low frequency mode power management states, the Intel Core Duo processor enters and exits from these states more quickly, providing fast response and significant power savings.
The Intel Core Duo processor also features Dynamic Bus Parking that allows the chipset to power down with the processor in these low-frequency mode states, delivering platform power savings.
- Enhanced Intel® Deeper Sleep with Dynamic Cache Sizing
This new power savings mechanism flushes system memory dynamically, based on demand or during periods of inactivity. Power savings occur as the cache ways are turned off once the data has been saved in memory. Because L2 cache data integrity determines the Deeper Sleep minimum voltage limits for the Intel Core Duo processor, once the Dynamic Cache Sizing feature flushes the entire Level-2 cache to memory, the processor transitions to a new power management state. This is called Enhanced Intel Deeper Sleep, and it allows the processor to lower its voltage below the Deeper Sleep minimum for enhanced power savings and/or efficiencies.
- Intel® Advanced Thermal Manager
The Intel Core Duo processor features a new thermal management system that delivers enhanced accuracy and more precise acoustic control. A new digital temperature sensor and thermal monitor on each individual core is located close to hot spots, enhancing accuracy at higher temperatures and enabling more precise fan control. The processor also supports the next generation dual-core optimized voltage regulator, Intel® Mobile Voltage Positioning (Intel® MVP VI), and includes the legacy thermal diode in the shared area as a fail-safe mechanism.
Power requirements for the Intel Core Duo processor are reduced to 65 W.
The latest powerful Xeon server processor line includes a model with power consumption down to 40 W.
Further energy reduction is expected from system integrator via software.
- Texas Instruments
The MSP430 is a microcontroller family from Texas Instruments. Built around a 16-bit CPU, the MSP430 is designed specifically for low-power embedded applications, and it is particularly well suited for metering, wireless radio frequency, or battery-powered applications.
The power budget of an MSP430 can be estimated from the typical 200 mA·h capacity of a CR2032 lithium coin cell, equivalent to about 22.8 μA·year. Considering only the CPU draw, such a battery could supply a 0.7 μA current draw for roughly 32 years. Although realistically battery self-discharge would reduce this number, it still represents an energy use many orders of magnitude lower than that of many other ICs.
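The coin-cell estimate above is a simple unit conversion; the sketch below restates that arithmetic.

```python
# Reproducing the coin-cell estimate: a 200 mAh CR2032 expressed in
# microampere-years, then divided by the assumed 0.7 uA average CPU draw.
HOURS_PER_YEAR = 365.25 * 24             # about 8766 hours

capacity_uah = 200 * 1000                # 200 mAh -> 200,000 uA·h
capacity_ua_year = capacity_uah / HOURS_PER_YEAR
print(round(capacity_ua_year, 1))        # ~22.8 uA·year

draw_ua = 0.7
print(round(capacity_ua_year / draw_ua, 1))  # ~32.6 years, ignoring self-discharge
```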
We propose to establish a metric, illustrated by Figure 2, based on a composite improvement of hardware, software and human factors. A 3-pronged adjustment is suggested:
- Contribution of consumers – 50%
- Contribution from hardware and software improvements – 40%
- Contribution from new CE devices to breed energy – 10%.
Item 1 is low-hanging fruit that can be accomplished through consumer education. Item 2 is ongoing, with efforts well under way championed by socially conscientious manufacturers, and the winner could end up with a large market share. Item 3 will lead consumers to a new frontier of quality of life, taking in alternative energy sources and changed behavior.
We propose a metric to reach 1% or near-zero energy consumption 10 years from now. Following Moore's Law, power consumption could be reduced by a factor of 25 over a period of 10 years. Thus, those CE products not meeting the 1% or zero energy consumption target in 10 years should be phased out of the market. This proposal takes into account the Smart Grid concept of alternative energy sources, where energy can be fed back to the grid. Examples are energy generated via vibration, magnetic induction, and other alternative energy sources; many are on the horizon already. This, coupled with user behavior change and abandonment of CE products below the goal, will undoubtedly bring energy consumption down to unprecedented levels.
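As a rough illustration of how the proposed contributions might combine, the sketch below takes the factor-of-25 device improvement and the 50% behavioral saving from the text; the combination rule and the baseline figure are assumptions made only for this sketch.

```python
# Illustrative only: combining the assumed per-device improvement (25x over
# 10 years) with the 50% consumer-behavior saving proposed above. The
# multiplicative combination and the 1000 kWh baseline are assumptions.
def projected_consumption(baseline_kwh, device_factor=25, behavior_saving=0.5):
    # Hardware/software improvements shrink device draw; behavior change then
    # trims usage on top of that.
    return baseline_kwh / device_factor * (1 - behavior_saving)

print(projected_consumption(1000))   # 1000 kWh today -> ~20 kWh, i.e. ~2% of baseline
# Energy "bred" by new CE devices (the remaining 10% contribution) would be
# netted against this figure to approach the 1%-or-zero target.
```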
The metric, if adopted industry-wide, could reverse the current awkward rejection of the Copenhagen Agreement by the world's biggest pollution contributors.
The Copenhagen Climate Change Summit has brought renewed responsibility to countries to reduce their carbon dioxide footprint, demanding energy savings in every aspect of our lives. Standards and regulations promoted within the US have driven energy-efficient electronics design since the early 1990s, reducing the energy consumption and energy waste of consumer electronics by 40% to 90%. Such reduction is deemed insufficient given the exponential increase in consumer electronics usage. The translation of Energy Star standards into the latest Copenhagen Agreement on carbon dioxide footprint reduction is not straightforward, and without a clear engineering metric, efforts toward energy savings are hard to call satisfactory. After investigating trends driven by Energy Star policies, it is recommended that zero-level energy consumption of consumer electronics be achieved by the year 2020. Combined with consumer behavior change, it is conceivable that we can not only reduce power usage drastically, but also breed energy by employing innovative technology using motion-generated energy and other alternative energy sources. Such technology innovation is in sight to make the proposed metric a reality.
Smart Broadband Service Architecture – May 2, 2010
Abstract – We have designed a service implementation of Broadband Internet through smart meters at 10 Mbps for the California Inland Empire area of about 7,000 square miles and a population of 4 million. Multi-dimensional technology features are planned to equip this economically distressed area with computer-based knowledge acquisition and job training tools at home. While building out the conventional Broadband Internet infrastructure, we also take advantage of the smart grid installation by Southern California Edison (SCE). With the area's rich solar resources, Internet service via smart meters can provide attractive income potential for distressed families, in addition to access to monthly payment cost control and a contribution to electricity load balancing in the SCE grand scheme.
Thus, extending smart grid protocol interworking to home networking can lay the cornerstone of our Broadband Service deployment to meet present needs and future advanced services, including emergency response and telemedicine.
Our targeted Broadband Internet deployment is for the Inland Empire communities in southern California, where Riverside and San Bernardino Counties are among the hardest hit in terms of housing foreclosure and unemployment. These counties under severe financial burden also qualify as underserved and unserved areas for Broadband Internet access above 5 Mbps. As these areas clearly require renovation, they can be a green field for unconventional deployment of proven technology components. We realize that home networking technology coupled with smart meter applications in a solar-rich area can equip low-income housing residents with computer-based knowledge acquisition and job training skills while providing the extra benefit of utility bill management. We identify specific protocol utilization to fulfill this need.
On the social front, it is expected that this broadband service with clearly targeted utility applications could jump-start low-income housing residents in their uphill financial recovery. Planning to start with the conventional Broadband Internet infrastructure, we take advantage of smart meters installed in houses by Southern California Edison to provide alternative Internet access, eventually adding broadband video and imaging capability for residential healthcare and emergency response services.
Area Served – Households & Businesses Passed – and Services
a) Areas of Interest
Riverside County and San Bernardino County in the Inland Empire communities comprise 668,905 Households (HH) and 51,755 Businesses Passed (BP). Our focus is low-income housing, which has recently reached 50% in the area due to the latest economic recession.
The topology lays out a network in Riverside County overlaid with geographical and demographic factors. Middle Mile hubs have been identified in Hemet and Indio in Riverside County, and in Hesperia and Victorville in San Bernardino County.
Southern California Edison is implementing Smart Meters regardless of the economic background of individual houses.
Targeting low-income housing first, we take advantage of SCE's deployment plan to provide poor housing environments with effective tools to acquire Internet knowledge and solar energy resources simultaneously!
b) Service Migration
(1) Value added service provisioning will be introduced as follows:
- smart grid access (Phase 1)
- First Response Unit access (Phase 2)
- Remote access to healthcare (Phase 3)
It is hoped to turn the "house of despair" into a "house of hope" with tools utilizing the broadband communication infrastructure and a platform of protocol conversion.
(2) Smart meters will be accessed and the proper protocols will be implemented to inter-work with WiFi/WiMax interconnection access point between the housing complex and the local utility access point.
(3) For areas that require distance connectivity beyond the local area access, Middle-mile broadband access will be deployed from 4 hubs initially, with 2 in Riverside and 2 in San Bernardino County. This will be accomplished via WiMax radios and interconnected with fiber facilities as the access bandwidth is expanded from mega- to giga-bits per second down the road.
(4) The local low-income housing renovation can be coupled with the fulfillment of utilities, with affordable wiring and computer applications to demonstrate and train for utilization of these advanced services to further include emergency service dispatch intelligence and hospital access.
(5) Healthcare data standards and network transparency will be implemented through software for secure control and health monitoring purposes, to assure connectivity between the housing complex and the appropriate medical facilities in the area. HIPAA compliance plays a key role in this connectivity.
(6) Upon the successful deployment of these services, retrofitting the same benefits to better-off residents would be a no-brainer in terms of technological and financial applicability. The focus then turns to Internet speeds higher than 10 Mbps.
c) Market Approach
A sustainable Internet infrastructure will be laid in the remote areas (relative to Los Angeles) among the Inland Empire communities, starting with low-income housing and migrating to general housing with higher Internet speeds for the entire area. To move housing residents toward financial independence, this deployment provides Internet tools for learning and job seeking.
Instead of seeking the most lucrative customers at the onset, the ARRA-Broadband funding directive is to stress-test proven technologies synergistically to yield the highest benefit to society at large.
Phased, incremental expansion of the existing Broadband Internet standards will be incorporated, linking smart grid access to emergency response service access and tele-healthcare. Mega- to giga-bps bandwidth will be installed via a mixture of wireless and wired network infrastructures. This infrastructure will start with the availability of smart grid access to afford low-income housing energy savings on utility bills. While simultaneously building the conventional Broadband Internet, the mixed use of broadband pipelines will be optimized under high and low utilization of the bandwidth to suit an economical delivery of video and medical/weather imaging down the road.
e) Network technology in the Grand Scheme
DSL, WiFi, WiMax, microwave and fiber will be deployed on demand to support the low-income housing renovation. Compliance with the Smart Grid IEEE P2030 and the healthcare HIPAA and HITECH protocols will be pursued as these standards mature. Most importantly, an IP message protocol will be implemented for transparency, to overcome the diversity of lower-layer protocols and hardware inherent to our mixed deployment of technologies.
Network and Service Design
The Broadband Internet Service, trade-named Rocket Broadband is revealed in Figure 2, starting with an architectural visualization of a proven smart grid network deployment by the City of Anaheim, to identify the service module access point. Four service modules are designed:
- Rocket 10M+ – WiFi/WiMax MIMO compatibility
- Rocket Smart – Utility Smart Grid towards IEEE P2030 compatibility
- Rocket Health – HIPAA & HITECH compliant
- Rocket First – Public Safety Network/Database interface compliant.
a) Smart Grid and Home Network
A smart grid network is described below.
The system supports 2 types of customers: residential customers and commercial/industrial customers. Residential customers (shown in the upper left corner of Figure 2) have a smart meter installed in their home by the Utility. It reads and converts the data through a transceiver to run on a 220 MHz WAN, where the frequency is raised from 200 MHz at the head-end to 900 MHz for communication between the base and the smart meters. Using the TCP/IP protocol, customer usage data are sent to the Utility's Meter Data Management & Storage (MDMS) system. The private wireless network (WAN) run by the Utility provides a cost-effective means of 2-way communications. There are other alternatives; what we describe above is a solid network in use.
Business and industrial customers (shown in the lower left corner of Figure 2) can connect with smart grid sensor devices and solution tools certified on public carriers' wireless data networks. The cost base is typically less than a dime per business or industrial customer, which is considered relatively affordable. Data can be viewed on the business customer's mobile devices. The usage data go to a Network Operations Center (NOC) using the IP protocol, are transmitted to the Trouble Management System (TMS) server, and are stored in the MDMS. Distinct firewalls with increasing security are placed between the NOC and TMS, and between the TMS and MDMS, as data move toward the staffed Center of Operation. An additional firewall is erected within the MDMS. Overall, there are 3 layers of firewalls. Security is the top priority.
The MDMS is integrated with the customer billing system. It collects the automated meter reads and will eventually connect to a web portal for customers to view their water and energy consumption. The Outage Management System (OMS), a tool that remotely detects the exact location of outages, will also be integrated with the MDMS.
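To make the residential data path concrete, the sketch below shows one way an interval read might be packaged and pushed to an MDMS collector over TCP/IP. The message fields, host name, and port are hypothetical, not part of the deployment described above.

```python
# Hypothetical sketch: packaging one smart-meter interval read as JSON and
# sending it to an MDMS collector over TCP/IP. Field names, host, and port
# are illustrative assumptions only.
import json
import socket
import time

MDMS_HOST = "mdms.example.net"   # placeholder collector address
MDMS_PORT = 5100                 # placeholder port

def send_interval_read(meter_id, kwh):
    reading = {
        "meter_id": meter_id,
        "timestamp": int(time.time()),
        "interval_kwh": kwh,
    }
    payload = json.dumps(reading).encode("utf-8") + b"\n"
    with socket.create_connection((MDMS_HOST, MDMS_PORT), timeout=10) as sock:
        sock.sendall(payload)

# Example call (would require a reachable collector):
# send_interval_read("SCE-000123", 0.42)
```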
Although a secondary priority in customer pursuit, the business/industrial customer entry point will be used for the establishment of service modules of Rocket Broadband 10M+, and Rocket Smart. These 2 service modules are plugged in as 2 industrial customers on the Utility’s smart grid network.
The smart meters feature sensors throughout the transmission and distribution grid to collect data, with real-time two-way communications to move data and electricity between the Utility/Rocket Broadband modules and consumers. The smart meters are equipped with the computing power necessary to make that intelligence actionable.
Distributed Energy Resources (DER) will allow consumers to obtain credits for their own generation of solar energy (it could be wind or other forms of alternative energy). Data storage devices are installed at customer premises to connect with the grid and open two-way flows of electricity and communications. DER migration will be critical in determining our network economics.
Understanding the advanced control methods used by the Utility allows us to provide the most sophisticated and reliable Internet service. These advanced control methods can monitor power system components, enabling rapid diagnosis and timely, appropriate responses to any event above and beyond our existing NOC capability. They also support market pricing, enhance asset data management and efficient operations, and involve a broad application of computer-based algorithms. In detail, they cover:
• Data collection and monitoring of all essential grid components
• Data analysis to diagnose and provide solutions from both deterministic and predictive perspectives.
Sensing and measurement technologies enhance power system measurements and facilitate the transformation of data into information to evaluate the health of equipment, support advanced protective relaying, enable consumer choice and help relieve congestion. These include:
• Smart meters
• Ubiquitous system operating parameters
• Asset condition monitors
• Wide-area monitoring systems (WAMS)
• Advanced system protection
• Dynamic rating of transmission lines
Advanced metering infrastructure measures and records usage data, at a minimum in hourly intervals, and provides that usage data to consumers. Rocket Broadband service modules will be fortified by the demanding Smart Grid network standards and requirements.
The IEEE P2030 standard – the Smart Grid Project – has as its goal to "permit two way power flow with communication and control to promote a more reliable and flexible electric power system." IEEE P2030 is viewed as a faster unification of meter, sensor and event data that offers more choice of peak curtailment notification services and of power rationing or prioritizing in shortage or outage situations; in other words, it provides resilient community capabilities. These advanced features will position the Rocket Broadband service well for future essential applications. This is deemed a luxurious synergy between what we can build on our own and the approach of using the smart grid as THE Internet.
Because the P2030 communication standard assumes vastly more bandwidth than is required for power control, borrowing P2030 network characteristics is a clear win, given its inherently high reliability and security requirements. Services can easily be latched on for burglary, fire, medical and environmental sensors and alarms, ULC and CCTV monitoring, access control and keying systems, intercoms, and secure phone line services running over AC power lines. Services such as radio, TV and general Internet could readily run over the same infrastructure, as their security needs are just starting to catch up with those of power or alarm systems. However, standardizing tariffs or interfaces for non-power-control applications has not been a function of P2030 itself; our service deployment would be contingent on private negotiation with Southern California Edison. We see no major issue, as we would participate in the smart grid as an industrial customer. Although only devices that continue to draw AC power directly would participate in controls as P2030 is currently defined, we would resort to our conventional DSL and broadband wireless implementation, explained in the next few sections, should the IEEE P2030 standards trail our needs.
b) HomePlug Protocol Landscape
HomePlug is another relevant protocol for data communication and smart grid communication at home. Since its formation in 2000, the HomePlug network protocol has had more than 170 products certified at an affordable cost, reaching millions of users in the last few years. At one end, the amount of data transmitted for control of lighting, appliances, climate control, security monitoring and other devices requires only narrowband information; at the other end, support of HDTV and VoIP requires bandwidth well above 10 Mbps for resolution and synchronization quality. The IEEE P1901 standards used by HomePlug incrementally incorporate physical layer support for these diversified applications, driving up to the gigabit range and extending to HomePlug AV2.
c) Networking Security
Security is absolutely essential to the smart grid, so it is highlighted below, with extreme caution for a deployment covering a financially distressed area. First, the goals for meter data security are described:
• Identifying critical sites and systems
• Protecting selected sites using surveillance and barriers against physical attack
• Protecting systems against cyber attack using information denial (masking)
• Dispersing sites that are high-value targets
• Tolerating disruptions
• Integrating distributed energy sources and using automated distribution to speed up recovery from any attack.
I. Cyber Security Manifestation
There are 3 firewalls in the Utility network.
- The wireless network for Supervisory Control and Data Acquisition (SCADA) is protected with an ID and a DNP address
- Substation Automation Systems are password protected
- The Utility continues to comply with the North American Electric Reliability Council (NERC) protection rules for power systems
The diversified applications impose some challenges for our integration. Case in point: security monitoring receives status reports from sensors and cameras via telemetry, as well as camera images at broadband speed; in-between bandwidths are largely unneeded. It becomes important for bandwidth optimization software to custom-tailor multiple reports intelligently while saving energy at the same time. Furthermore, security of the low-income housing Internet requires extra attention, and the HomePlug protocol has by far the most inherent network security mechanisms built in from the onset, compared with earlier home network protocols where non-harmful environments were assumed.
Thus, starting with the HomePlug protocol for in-house connectivity is based on very legitimate rationale for the areas identified. Moving outward, assuring connectivity from a home router to the curb, onto the access point, and into our overall network requires further interworking between WiFi and WiMax, as explained below.
d) HomePlug connecting to WiFi and WiMax
Our parallel path, besides the smart grid, is the use of HomePlug as an in-home wired broadband technology, which connects, via a gateway, to an access network that covers the last mile or last 100 meters around the house and communities from a backhaul network. The access networks include fiber optics/cable, DSL, WiFi [11], and WiMAX [12], to be decided on a housing-complex-by-complex basis. In addition, where the HomePlug signals do not reach around a house, either due to the lack of power lines or the impairment of the power lines, a wireless signal around the house can cover the "dead corners". The wireless protocol of our choice is WiFi, versions 802.11g/n with Multiple Input Multiple Output (MIMO), which has been deployed at large scale and has proven effective, economical, and reliable. We propose that WiFi cover the last 100 meters around the houses and communities. We would deploy a few WiFi access points (interim base stations) connected to backhaul networks through DSL, fiber optic/cable or WiMAX (discussed later); these WiFi access points communicate with WiFi customer premises devices (the WiFi user terminals), which are modules installed in a computer or electronic device.
WiMax is a relatively new technology that can reach longer distances and carry more data throughput, at a relatively higher cost. It is a cost-effective and evolutionary technology to cover the distance between communities and the backhaul network where broadband alternatives are not available.
WiFi and WiMax interworking is accommodated in Standards’ versions that support MIMO features.
Protocol Conversions and Network Issues
As discussed above, there are many technologies and network types to contemplate. For the access network, there are wired communication protocols such as cable, DSL and fiber optics, and wireless protocols, e.g., WiFi, WiMAX, and 3G/4G cellular. For the in-house networks, there are broadband technologies, e.g., HomePlug [4], MoCA [9], and WiFi, and various narrowband protocols targeting sensors and smart meters, e.g., Zigbee, ISA 100, and WirelessHART [10]. Furthermore, there are various other ISM-based communication protocols for private and de facto applications.
These technologies all have advantages and disadvantages for migration considerations, and they are expected to coexist within our network deployment and operations timeframe. Unfortunately, making them interoperable is not always easy.
The Rocket Broadband Services take advantage of smart grid communication and smart home healthcare communication systems which will be transparent to all these “low-level” PHY and MAC protocols considered in P2030, WiFi, and WiMax. We pick and choose among reliable, inexpensive, and real-time communication devices from the market. We build a virtual network environment on top of a heterogeneous network of networks and devices.
A case in point: a Zigbee-based network measuring electric meters and other devices (water meters and gas meters in addition) connects with a gateway device, which also connects to a proprietary ISM-based narrowband network that monitors a medical device as part of a home healthcare system. Furthermore, the gateway device connects with the HomePlug network to send data to a computer situated upstairs in the house, and it communicates with a nearby WiFi access point connected to the Internet. The gateway device converts the non-IP packets to IP packets and vice versa. The Rocket Service Control Center a hundred miles away can communicate with these field sensors easily only through such an IP gateway.
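A minimal sketch of that gateway role follows: a compact non-IP sensor frame is decoded and re-emitted as an IP-friendly message before being forwarded upstream. The frame layout, device table, and field names are assumptions made for illustration only.

```python
# Minimal sketch of the gateway conversion described above: decode a compact
# non-IP sensor frame and re-emit it as a JSON message suitable for IP
# transport. The 8-byte frame layout and device table are illustrative.
import json
import struct

DEVICE_TABLE = {0x01: "electric_meter", 0x02: "water_meter", 0x10: "pulse_oximeter"}

def frame_to_ip_message(frame: bytes) -> bytes:
    # Assumed layout: device type (1 byte), device id (2 bytes), sequence
    # number (1 byte), reading scaled by 100 (4 bytes, unsigned big-endian).
    dev_type, dev_id, seq, scaled = struct.unpack(">BHBI", frame)
    message = {
        "device": DEVICE_TABLE.get(dev_type, "unknown"),
        "id": dev_id,
        "seq": seq,
        "value": scaled / 100.0,
    }
    return json.dumps(message).encode("utf-8")

# Example: a home-healthcare sensor (id 7) reporting a reading of 97.25.
frame = struct.pack(">BHBI", 0x10, 7, 12, 9725)
print(frame_to_ip_message(frame))
```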
We see the need for an IP-based Smart Grid messaging system with security support that takes advantage of the ubiquitous Internet Protocol. This is by far the most suitable mechanism as a catalyst to converge all communication technologies. Although some standards committees have recognized this, there has not been visible progress in constructing IP message sets for the smart grid, let alone for emergency response or healthcare support over communications. We expect that standards defining the Smart-Grid message set will be usable by all lower-level communication protocols, including extensions of the IP protocol itself. Only through standards can the conversion of packet-level communications from non-IP packets to IP packets be facilitated for across-the-board applications.
The Smart-Grid message set we envision for our Rocket Smart service module would have profiles to take care of either very low power, battery-driven, inexpensive devices or more powerful, plug-to-the-wall devices. Furthermore, the Smart-Grid message set should be symmetric, to allow energy payback to the Smart-Grid system.
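Such a message set might look something like the sketch below, with a profile hint for constrained devices and a signed consumption field so that energy fed back to the grid can be reported symmetrically. The schema is purely a hypothetical illustration, not a standardized message set.

```python
# Hypothetical illustration of a symmetric Smart-Grid message: one record
# type reports both consumption (positive kWh) and payback to the grid
# (negative kWh), with a "profile" hint for constrained devices.
from dataclasses import dataclass, asdict
import json

@dataclass
class SmartGridMessage:
    device_id: str
    profile: str         # "constrained" (battery/low power) or "mains"
    interval_kwh: float  # positive = consumed, negative = fed back to grid
    priority: int = 0    # e.g., raised during curtailment or outage events

def encode(msg: SmartGridMessage) -> bytes:
    return json.dumps(asdict(msg)).encode("utf-8")

print(encode(SmartGridMessage("meter-17", "mains", 0.35)))
print(encode(SmartGridMessage("solar-17", "mains", -1.20)))  # energy payback
```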
In the deployment of new DSL, broadband wireless, or fiber, or in tapping into the on-going Smart Grid network, the physical and MAC layer protocols are not the issue. Our IP-based messaging set can be transparent to whatever PHY and MAC layers lie underneath, thus allowing flexibility and economies of scale in implementation; such is the ultimate pursuit of our Rocket Broadband Services.
The Rocket Broadband Services take advantage of the on-going build-out of the Smart Grid network, in addition to on-demand implementation of broadband wireless gear and the conventional DSL, cable and fiber networks, for a first application in distressed-housing access to the Internet. A synergistic offering of Internet services is planned with the Utility smart meter service in the short run, and with emergency response, telemedicine, and many other emerging services in the long run, through the IP Messaging Set protocol.
Critical System Resolution for High Definition Satellite Set-Top Box Deployment (excerpt) – November 15, 2008
This paper describes the system debugging experience associated with one of the first deployments of satellite video broadcasting High Definition Set-Top Boxes (HD STB) in the United States. Since the hardware is not easily changeable, the system investigation sorts customer-impacting bugs into the categories of application software, set-top box driver software, and processor driver software. The critical software debugging effort is described and the root causes are discussed with some detailed examples. Multi-layer and multi-organization development is practiced to assure product quality and time to market.
Satellite video broadcasting pioneered the digital conversion of video/audio/data stream transmission and reception. The signal processing functions include receiving, de-multiplexing, decoding and displaying onto televisions with analog or digital inputs at standard or high definition resolutions. These functions are implemented in the High Definition Set-top Box (HD STB), which was introduced to the market in 2007. This paper describes the software debugging experience associated with the initial deployment of the HD STB. As each generation of STB hardware is targeted to last around 1 year, any upgrades within a year have to depend on software capability.
The software stack implemented in an HD STB involves 4 layers:
- Video Processor Driver Layer
- STB Control Driver Layer
- Middleware Layer
- User Interface Layer
Different organizations are involved in the development of these 4 layers. The work is further integrated to process functions as reflected in the HD STB architecture.
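To make the layering concrete, the sketch below models the four layers as thin interfaces, each delegating to the one beneath it. All class and method names are invented for illustration and do not reflect the actual vendor APIs used in the deployment.

```python
# Illustrative only: the four-layer HD STB software stack as thin Python
# interfaces. Real STB stacks expose vendor-specific C APIs at each boundary.
class VideoProcessorDriver:            # lowest layer: decode hardware control
    def decode(self, transport_stream):
        return "frames(" + transport_stream + ")"

class STBControlDriver:                # tuner/demux control, wraps the processor
    def __init__(self, vp: VideoProcessorDriver):
        self.vp = vp
    def tune(self, channel):
        return self.vp.decode("ts@" + str(channel))

class Middleware:                      # guide data, conditional access, buffering
    def __init__(self, stb: STBControlDriver):
        self.stb = stb
    def change_channel(self, channel):
        return self.stb.tune(channel)

class UserInterface:                   # top layer: remote-control events
    def __init__(self, mw: Middleware):
        self.mw = mw
    def on_channel_up(self, current):
        return self.mw.change_channel(current + 1)

ui = UserInterface(Middleware(STBControlDriver(VideoProcessorDriver())))
print(ui.on_channel_up(7))             # -> frames(ts@8)
```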
The hooks that link the various layers consist of timing, metaphor, and other buffer boundary conditions to deal with corner cases of video delivery issues. The goal is to present top-quality video and audio to consumers. A great deal of effort went into debugging rare occurrences because of the stringent quality standards set by the customers. These products finally reached stabilization in early 2008, to the satisfaction of the service provider and the delight of the ultimate consumers. While efforts at quality improvement continue, this paper describes some of the more challenging software debugging experiences associated with one of the first HD STB deployments in the United States.
Major Issues Associated with Software Implementation and Integration
Critical and challenging issues are resolved with intense root-cause analysis followed by software modifications and integration involving many diverse organizations. A few highlights include:
- Channel Change Timing
- Loss of Video and Audio
- Signal Strength Optimization
- Guided Setup to Programs
Satellite HD STB development has to involve a multi-layer, multi-vendor software development process with close working relationships and frequent communication. The traditional, structured development time frame has been shortened significantly. The reality is that field maintenance support has become crucial to assure system stabilization. Whether this defies traditional practice is not as important as finding practical solutions to meet customers' demands in time to market. The introduction of this feature-rich entertainment equipment thus has to rely on cooperation among multiple organizations and a balancing of quality and customer demands, much of which happens after the product is deployed.