Mobicom. Day 3
The third and final day, mainly about the PHY/MAC layer and theory work.
The day started with a keynote by Farnam Jahanian (University of Michigan, NSF). Jahanian talked about some of the opportunities behind cloud computing research. In his opinion, cloud computing can enable new solutions in fields such as healthcare and environmental science. As an example, it can help to enforce a greener and more sustainable world and to predict natural disasters (e.g. the recent Japanese tsunami) with the support of wider sensor networks. His talk concluded with a discussion of some of the challenges facing computer science research in the US (which seem to be endemic to other countries as well). He highlighted that although the market demands more computer science graduates, few students are joining related programs at any level, including high school.
Session 7. MAC/PHY Advances.
No Time to Countdown: Migrating Backoff to the Frequency Domain, Souvik Sen and Romit Roy Choudhury (Duke University, USA); and Srihari Nelakuditi (University of South Carolina, USA)
Conventional WiFi networks perform channel contention in the time domain, an approach that wastes channel time on backoff. Back2F is a new way of performing channel contention in the frequency domain by treating OFDM subcarriers as randomised integers: instead of picking a random backoff duration, each node signals on a randomly chosen subcarrier. The technique requires an additional listening antenna, which allows WiFi APs to learn the backoff values chosen by nearby access points and decide whether their own value is the smallest among those generated by APs in close proximity. Each AP uses this knowledge individually to schedule transmissions after every round of contention. Moreover, by incorporating a second round of contention, the APs that collided in the first round can compete again, along with a few new ones. The performance evaluation was done in a real environment. The results show that the collision probability decreases considerably with Back2F when two contention rounds are used. Real-time traffic such as Skype experiences a throughput gain, but Back2F is more sensitive to channel fluctuations.
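The contention scheme can be illustrated with a toy simulation (a simplified sketch of the idea, not the authors' implementation: all names and parameter values here are assumptions, and real Back2F operates on OFDM symbols, not abstract integers):

```python
import random

def contend(n_nodes, n_subcarriers, rounds=2, rng=random):
    """Frequency-domain contention sketch: each contender signals on a
    randomly chosen subcarrier; those that picked the lowest-index
    subcarrier win the round. A second round breaks first-round ties.
    Returns how many nodes still believe they won (1 = no collision)."""
    winners = list(range(n_nodes))
    for _ in range(rounds):
        if len(winners) == 1:
            break  # a unique winner: it transmits, the rest stay silent
        picks = {node: rng.randrange(n_subcarriers) for node in winners}
        lowest = min(picks.values())
        winners = [node for node, sub in picks.items() if sub == lowest]
    return len(winners)

def collision_prob(n_nodes, n_subcarriers, rounds, trials=20000, seed=0):
    """Estimate the probability that more than one node wins contention."""
    rng = random.Random(seed)
    collisions = sum(contend(n_nodes, n_subcarriers, rounds, rng) > 1
                     for _ in range(trials))
    return collisions / trials
```

Running `collision_prob(10, 16, rounds=1)` against `collision_prob(10, 16, rounds=2)` shows the effect reported in the talk: the second round sharply cuts the residual collision probability.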
Harnessing Frequency Diversity in Multicarrier Wireless Networks, Apurv Bhartia, Yi-Chao Chen, Swati Rallapalli, and Lili Qiu (University of Texas at Austin, USA)
Wireless multicarrier communication systems spread data over multiple subcarriers, but the SNR varies across subcarriers. In this presentation, the authors propose jointly integrating three solutions to reduce the side effects:
- Map symbols to subcarriers according to their importance.
- Effectively recover partially corrupted FEC groups and facilitate FEC decoding.
- MAC-layer FEC to offer different degrees of protection to symbols according to their error rates at the PHY layer.
Their simulation and testbed results corroborate that a joint combination of these techniques can increase throughput by 1.6x to 6.6x.
Beamforming on Mobile Devices: A first Study, Hang Yu, Lin Zhong, Ashutosh Sabharwal, David Kao (Rice University, USA)
Wireless links present two invariants: spectrum is scarce while hardware is cheap. The fundamental waste in cellular base stations comes from the antenna design. Lin Zhong proposed passive directional antennas to minimise this issue, using them to generate a very narrow beam with larger spatial coverage. They showed that the solution is practical despite the small form factor of smartphone antennas, resilient to node rotation (only 2-3 dB lost compared to a static node), and does not hurt handset battery life, especially on the uplink where the antenna's beam is narrower. The technique also allows calculating the optimal number of antennas for efficiency. The system was evaluated both indoors and outdoors in stationary and mobile scenarios. The results show that substantial power can be saved on the client, with power consumption dropping as the number of antennas increases.
Session 8. Physical Layer
FlexCast: Graceful Wireless Video Streaming, S T Aditya and Sachin Katti (Stanford University, USA)
This is a scheme to adapt video streaming to wireless communications. Mobile video traffic is growing exponentially, yet users' experience is often poor because of channel conditions. MPEG-4 estimates quality over long timescales, but channel conditions change rapidly, which hurts video quality. Current video codecs are not equipped to handle such variations since they exhibit an all-or-nothing behaviour. The authors propose making quality proportional to the instantaneous wireless quality, so that a receiver can reconstruct a video encoded at a constant bit rate by taking the instantaneous network quality into account.
A Cross-Layer Design for Scalable Mobile Video, Szymon Jakubczak and Dina Katabi (Massachusetts Institute of Technology, USA)
One of the best papers at Mobicom'11. Mobile video is limited by the bandwidth available in cellular networks and by a lack of robustness to changing channel conditions. As a result, video quality must be adapted to the channel conditions of different receivers. The authors propose a cross-layer design for video that addresses both limitations. In their view, the problem is that compression and error protection convert real-valued pixels to bits and, as a consequence, destroy the numerical properties of the original pixels. In analog TV this was not a problem: there was a linear relationship between the transmitted values and the pixels, so a small perturbation in the channel translated into a small perturbation in the pixel values (though this was inefficient, as the data was not compressed).
SoftCast is as efficient as digital TV while also compressing data linearly (current compression schemes are not linear, which is why the numerical properties are lost). SoftCast transforms the video into the frequency domain with a 3D DCT. In the frequency domain, most temporal and spatial frequencies are zero, so the compression sends only the non-zero frequencies. Because the transform is linear, the output preserves the numerical properties of the input. The presentation ended with a demo showing the real gains of SoftCast over MPEG-4 when the SNR of the channel drops.
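The key property, that a linear transform maps channel noise into a proportional pixel perturbation, can be sketched with a 2D DCT on a single frame (the paper uses a 3D DCT across frames, and includes power allocation and other machinery this toy example omits; function names here are mine):

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n).reshape(-1, 1)       # frequency index (rows)
    i = np.arange(n).reshape(1, -1)       # sample index (columns)
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)            # DC row gets a smaller scale
    return m

def encode(frame):
    """2D DCT of a frame: rows and columns transformed linearly."""
    c = dct_matrix(frame.shape[0])
    r = dct_matrix(frame.shape[1])
    return c @ frame @ r.T

def decode(coeffs):
    """Inverse 2D DCT: transpose of an orthonormal matrix is its inverse."""
    c = dct_matrix(coeffs.shape[0])
    r = dct_matrix(coeffs.shape[1])
    return c.T @ coeffs @ r
```

Because `encode`/`decode` are linear, `decode(coeffs + noise)` equals `decode(coeffs) + decode(noise)`: a small channel perturbation on the transmitted coefficients becomes an equally small perturbation on the reconstructed pixels, which is exactly the graceful degradation a bit-based codec loses.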
Practical, Real-time Full Duplex Wireless, Mayank Jain, Jung Il Choi, Tae Min Kim, Dinesh Bharadia, Kannan Srinivasan, Philip Levis and Sachin Katti (Stanford University, USA); Prasun Sinha (Ohio State University, USA); and Siddharth Seth (Stanford University, USA)
This paper presents a full duplex radio design using signal inversion (based on a balanced/unbalanced (balun) transformer) and adaptive cancellation. The state of the art in RF full-duplex solutions relies on techniques such as antenna cancellation, which suffer from several limitations (e.g. manual tuning, channel dependence). The new design supports wideband and high-power systems without imposing any limitation on bandwidth or power. The authors also presented a full duplex medium access control (MAC) design and evaluated the system on a testbed of 5 prototype full duplex nodes. The results look promising so... now it's the time to re-design the protocol stack!
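The two cancellation stages can be caricatured numerically (a baseband sketch under assumed gain values, not the paper's RF design: the balun inversion is modelled as an imperfect sign flip, and the adaptive stage as a least-squares subtraction of the known transmitted signal):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
tx = rng.standard_normal(n)              # our own transmitted samples
remote = 0.01 * rng.standard_normal(n)   # weak signal from the far node

# Stage 1, signal inversion: the balun feeds an inverted copy of tx, so the
# two copies nearly cancel at the receive antenna. The residual gain
# mismatch (0.02 here) is an assumed imperfection.
h_direct, h_inverted = 1.00, -0.98
self_interference = (h_direct + h_inverted) * tx
received = self_interference + remote

# Stage 2, adaptive cancellation: estimate the residual self-interference
# channel from the known tx samples (least squares) and subtract it.
h_est = np.dot(received, tx) / np.dot(tx, tx)
cleaned = received - h_est * tx

# Total suppression of the raw self-interference, in dB.
suppression_db = 10 * np.log10(np.var(tx) / np.var(cleaned))
```

After both stages, `cleaned` is dominated by the remote signal: stage 1 removes most of the interference in the analog domain (so the ADC is not saturated), and stage 2 mops up the residual digitally.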
Session 9. Theory
Understanding Stateful vs Stateless Communication Strategies for Ad hoc Networks, Victoria Manfredi and Mark Crovella (Boston University, USA); and Jim Kurose (University of Massachusetts Amherst, USA)
There are many communication strategies, and which one is appropriate depends on the network's properties. This paper explores an adaptive forwarding strategy that decides when, and with what state, each communication strategy should be used, based on network unpredictability and connectivity. Three network properties (connectivity, unpredictability, and resource contention) determine when state is useful. Data state is information about data packets and is valuable when the network is not well connected, whereas control state is preferred when the network is well connected. Their results (based on analysis and simulations over Haggle and DieselNet traces) show that routing is the right strategy when control state is useful, DTN forwarding when data state is useful (e.g. the Haggle Cambridge traces), and packet forwarding when both data and control state matter simultaneously (e.g. the Haggle Infocom traces).
Optimal Gateway Selection in Multi-domain Wireless Networks: A Potential Game Perspective, Yang Song, H. Y. Wong, and Kang-Won Lee (IBM Research, USA)
This paper considers a coalition network in which multiple heterogeneous domains are interconnected via wireless links. Gateway nodes are designated by each domain to achieve network-wide interoperability. The challenge is minimising the sum of the intra-domain and backbone costs. The authors take a potential-game perspective to solve this problem and analyse the inefficiency of the resulting equilibria. They note that the solution can also be applied to other problems such as power control, channel allocation, spectrum sharing, and even content distribution.
Fundamental Relationship between Node Density and Delay in Wireless Ad Hoc Networks with Unreliable Links, Shizhen Zhao, Luoyi Fu, and Xinbing Wang (Shanghai JiaoTong University, China); and Qian Zhang (Hong Kong University of Science and Technology, China)
Maths, percolation theory ... quite complex to put into words
The San Diego Trip: An Overview of this year’s SIGKDD Conference
This year's SIGKDD conference returned after 12 years to San Diego, California, to host the meeting of data mining and knowledge discovery experts from around the world. The elite of heavyweight data scientists was hosted at the largest hotel on the West Coast and, together with industry experts and government technologists, numbered more than 1,100 attendees, a record in the conference's history.
The gathering kicked off with tutorials, including two classics running in parallel: David Blei's topic models and Jure Leskovec's extensive work on social media analytics. Blei offered a refreshing talk that stretched from the very basics of text-based learning to the most up-to-date extensions of his work, with applications to streaming data and an online version of the paradigm that scales the model to huge datasets, satisfying the requirements of modern data analysis. Leskovec elaborated on a large spectrum of his past work, covering topics including the temporal dynamics of news articles, sentiment polarisation analysis in social networks, and information diffusion in graphs by modelling the influence of participating nodes. The first day's menu on the social front was completed with Lada Adamic's presentation on the relationship between structure and content in social networks. Her talk at the Mining and Learning with Graphs Workshop provided an empirical analysis of a variety of online domains, describing how the flow of novel content in those systems reflected variations in the patterns of interaction amongst individuals. The day closed with the conference's plenary open session, which featured submission and reviewing highlights and the usual KDD award ceremonies: the latter honoured the decision-trees man, Ross Quinlan, who presented a historical overview of his work, and a data mining legion of 25 students from NTU who won this year's KDD Cup on music recommendations.
After a second night of sleep and repeated jetlag-induced wake-ups, Monday rolled in and the conference opened with sessions on user classification and web user modelling. A follow-up in the afternoon with the presentation of the (student) award-winning work on the application of topic models to scientific article recommendation attracted the interest of many. The conference's dedicated session on online social networks also signalled the data mining community's interest in this nowadays hot domain. It opened with an interesting work on predicting semantic annotations in location-based social networks, in particular the prediction of missing labels for venues lacking user-generated semantic information. While the machine learning part of the work was sound, its applicability as a real problem was doubted, suggesting the need to identify the essential challenges in a relatively new application area. Nonetheless, the keyword of the day was scalability: two talks focused on an ever-classic machine learning problem, clustering, introduced in the context of the trendy MapReduce model. Alina Ene from the University of Illinois introduced the basics, whereas the Brazilian Robson Cordeiro offered novel insights with a cutting-edge algorithm for clustering huge graphs. The work, driven by the guru Christos Faloutsos, paired the elegance of simplicity with the virtues of effectiveness, showing that for some size does not matter and petabytes of data can be crunched in minutes. A poster session came to close the curtains on another day. The crowd was not discouraged by the organisers' one-free-drink-only offer, and a vibrant set of interactions took place. Some discussed techniques, some looked for new datasets, while social cliques formed in the corners of the hotel's huge Douglas Pavilion.
Day 3 drove the conference participants into the dark technical depths of the well-established topic of matrix factorisation, followed by the user modelling session. Yahoo!'s Bee-Chung Chen gave an intriguing presentation on user reputation in a comment-rating environment, followed by a lucid talk by Panayiotis Tsaparas on selecting a useful subset of reviews for Amazon products plagued by tons of reviews. The Boston-based Greek gang of Microsoft Research also showed how Mechanical Turk can be used to assess the effectiveness of review selection in such systems. Poster session number 2 closed the day, and the group's work on link prediction in location-based social networks was up. The three-hour, exhausting but fruitful interaction with location-based enthusiasts, agnostics and doubters was a good opportunity to get the vibe of the community on an up-and-coming hot topic. For application developers and online service providers, the work was an excellent example of how location-based data could be used to drive personalised and geo-temporally aware content to users. For data mining geeks, it presents an unexplored territory where existing techniques can be tested and novel ones devised. At the end of the poster session, many participants headed for a taste of San Diego's downtown, whereas relaxing boat trips on the local bay were also highly preferred.
The final day of the conference was marked by Kaggle's visionary entrepreneur Jeremy Howard and a panel of experts on data mining competitions. The panel aimed to analyse the problems that arose during previous competitions and the lessons learned for creating new successful ones. Howard presented radical views, suggesting that the future of data mining and problem solving would be delivered in the form of competitions. Not only could competitions attract an army of approximately 10 million data analysts around the globe, but their design could promise a sustainable economic model that would bring money to all participants (even non-winners) and would perhaps put at stake a respectable number of PhD careers. His philosophy was driven by the idea that to solve challenging problems effectively, you need to awaken the diverse pool of minds out there that can constitute an infinite source of innovation.
But KDD attracted not only the interest of scientists and corporate experts, but also that of politicians. Ahead of the 2012 elections, the Obama data mining team is here and hiring! Rayid Ghani, chief scientist at Obama for America, highlighted the important role of predictive analytics and optimisation problems in the battle for an electorate that traditionally announces winners by only small margins. It remains to be seen whether science will beat Tea Party-style propaganda and maximise positive votes in a bumpy and complex socio-political landscape. The political world was also (quietly) represented by government data scientists and secret service analysts seeking to catch up with the state of the art in data mining and knowledge discovery, a vital survival requirement in a world overflowing with data and subsequent leaks...
The full proceedings of KDD 2011 can be found here.