till tomorrow, i'm at the web science summer school. i was invited to give a talk on privacy in mobile-social networking applications. my talk was a re-mix of blog posts and papers (including spotme, "what we geeks don't get about social media privacy", and "location-related privacy in geo-social networks" - pdf). unfortunately i could not attend the whole summer school, but you can check the schedule here, and my notes on a couple of talks follow.
marcel karnstedt gave a great presentation on the effects of user features on churn in social networks. he presented a nice empirical study of the mechanisms by which a web forum maintains a viable user base. he found that different forums show different behavioural patterns, and he also spotted a few interesting regularities. have a go at his paper (pdf).
bernie hogan wondered what kind of mental models people have of their Facebook personal (ego) networks. to answer this question, he collected the mental models that a number of Facebook users have of their personal networks, collected the actual personal networks from Facebook, clustered them using a community detection algorithm, and looked at the extent to which the mental maps overlapped with the actual networks. he found that people are good at identifying the clusters they are involved in, but are not good at identifying which of their social contacts act as 'brokers' in the network. this finding has interesting implications - eg, since opportunities and new ideas tend to come from brokers, and people find it difficult to identify brokers, it follows that people do not know where to look for new ideas, right? ;) bernie also said that neurotics tend to have broken networks, while extroverts tend to have clustered networks. check bernie's publications here!
the student projects look very interesting. they include collaborative filtering, sentiment analysis, and community detection.
Tracking and Saving Energy
Today there was only a morning session at MobiSys, on location tracking and energy efficiency. The first presentation was Energy-Efficient Positioning for Smartphones using Cell-ID Sequence Matching by J. Paek (Univ. of Southern California), K. Kim (Deutsche Telekom), J. Singh (Deutsche Telekom) and R. Govindan (Univ. of Southern California). The paper is about energy-efficient location techniques and seems to extend a previous paper presented at MobiSys'10. The authors try to combine the complementary features of GPS and Cell-ID: GPS is more accurate than Cell-ID, but it costs more energy. During the talk they showed how inaccurate and inconsistent network-based location is in urban environments: it has a mean error on the order of 300m and, for the same location, can report different positions at different times. Their system uses cell-ID sequence matching together with a history of cells and GPS coordinates, and it also uses time-of-day as a hint. It opportunistically builds a history of users' routes and of the transitions between cells. It then uses the Smith-Waterman algorithm for sequence matching against this historic data: it looks for the sub-sequence in the database that matches best, and it turns GPS ON only when there is no good match. This approach can save more than 90% of the GPS energy, since GPS usage goes down as learning progresses. The only limitation is that the system cannot detect small detours, but the authors argue this is not a big issue.
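To make the matching step concrete, here is a minimal sketch of Smith-Waterman local alignment over cell-ID sequences. The scoring values, the match threshold and the fall-back-to-GPS rule are my own illustrative assumptions, not the paper's parameters.

```python
def smith_waterman(history, current, match=2, mismatch=-1, gap=-1):
    """Local-alignment score between a stored cell-ID route and the
    currently observed cell-ID sequence (scoring values are made up)."""
    n, m = len(history), len(current)
    # H[i][j]: best local score ending at history[i-1] / current[j-1]
    H = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = H[i - 1][j - 1] + (match if history[i - 1] == current[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

def best_matching_route(route_db, observed, threshold=6):
    """Pick the stored route that best matches the observed cells;
    return None (i.e. turn GPS on) when no score clears the threshold."""
    score, route = max((smith_waterman(r, observed), r) for r in route_db)
    return route if score >= threshold else None
```

The intuition is that once a user's commute has been seen a few times, the observed cell sequence aligns strongly with a stored route and the GPS can stay off.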
Energy-efficient Trajectory Tracking for Mobile Devices by M. Kjærgaard (Aarhus Univ.), S. Bhattacharya (Univ. of Helsinki), H. Blunck (Aarhus Univ.) and P. Nurmi (Univ. of Helsinki). It is possible to retrieve location via WiFi, GPS or GSM, and many location-aware services require trajectory tracking. This paper proposes new sensor-management strategies and builds on a previous paper published at MobiSys'09 called EnTracked. The system minimizes the use of GPS by predicting, with the help of sensors such as the radio, accelerometer and compass, how long it can sleep before the next position fix (it was not clear to me what happens, in terms of energy consumption and usability, if the GPS returns to the cold-start phase or has to remain in higher power modes, since they rely on an energy model for that). The system requires the collaboration of a server. They also performed a comparative analysis with previous systems presented at MobiSys; this one shows the lowest energy consumption of them all.
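The core duty-cycling idea can be sketched very simply: given the application's accuracy limit and an estimate of how fast the user can move, the GPS may sleep until the worst-case position uncertainty would exceed the limit. This is my own simplification of the idea, not the paper's actual model.

```python
def sleep_time(error_limit_m, est_error_m, max_speed_mps):
    """Seconds the GPS can stay off before position uncertainty could
    exceed the accuracy limit, assuming the user moves at max_speed
    the whole time. All numbers here are illustrative."""
    budget = max(error_limit_m - est_error_m, 0.0)
    if max_speed_mps <= 0:       # stationary user: no need to wake up
        return float('inf')
    return budget / max_speed_mps
```

E.g. with a 100m accuracy limit, a 10m current error and a walking speed of 3 m/s, the GPS can sleep for 30 seconds before the next fix.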
Profiling Resource Usage for Mobile Applications: a Cross-layer Approach by F. Qian (Univ. of Michigan), Z. Wang (Univ. of Michigan), A. Gerber (AT&T Labs), Z. Mao (Univ. of Michigan), S. Sen (AT&T Labs) and O. Spatscheck (AT&T Labs). The idea is to give developers a good understanding of how their apps affect the energy consumption of mobile handsets when they use cellular networks. The authors look at the different power states of UMTS, at the time required to move between states, and at how an app can cause the cellular interface to move between them. Their system collects packet traces, users' input and the packet-to-process correspondence. They associate each state with a constant power value measured with a power meter (signal strength is not taken into account), and they use this to perform a detailed analysis of TCP/HTTP behaviour and to see how bursts incur energy overheads. They ran studies on popular apps and web services.
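A toy version of this kind of accounting is easy to sketch: each packet promotes the radio to the high-power state, which then "rides" inactivity timers back down. The power draws and timer values below are illustrative assumptions; the real ones are device- and carrier-specific (and, in the paper, measured with a power meter).

```python
# Illustrative power draws (W) and demotion timers (s); not real measurements.
POWER = {'FACH': 0.4, 'DCH': 0.8}
DCH_TAIL, FACH_TAIL = 5.0, 12.0   # DCH->FACH and FACH->IDLE inactivity timers

def radio_energy(packet_times):
    """Energy (J) spent in DCH/FACH for a packet trace: each packet
    promotes the radio to DCH, and the tail timers drain energy after it."""
    energy, prev = 0.0, None
    for t in sorted(packet_times):
        if prev is not None:
            gap = t - prev
            energy += POWER['DCH'] * min(gap, DCH_TAIL)
            if gap > DCH_TAIL:
                energy += POWER['FACH'] * min(gap - DCH_TAIL, FACH_TAIL)
        prev = t
    # full tail after the last packet
    energy += POWER['DCH'] * DCH_TAIL + POWER['FACH'] * FACH_TAIL
    return energy
```

Even this toy model shows the point about bursts: two packets sent 2s apart cost less radio energy than the same two packets sent 10s apart, because the second schedule pays for a long high-power tail in between.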
Self-Constructive, High-Rate System Energy Modeling for Battery-Powered Mobile Systems by M. Dong (Rice Univ.) and L. Zhong (Rice Univ.). This paper aims to build a high-rate virtual power meter without requiring external tools, by looking at how fast the battery charge decreases. Classic power models are based on linear-regression techniques; they usually need an external multimeter to be generated, they require good hardware knowledge to build, and they are hardware dependent. These factors limit the accuracy of the model. Sesame is a self-constructive and personalized power model that looks only at the battery interface and uses statistical learning (Principal Component Analysis). The authors measured the error reported by the low-rate battery interface (which is non-Gaussian) in order to increase its accuracy. The computational overhead of the measurements might be high, but the system is able to generate energy models at 100Hz: it takes 15 hours to generate models that achieve an average error of 15%. It's more accurate than PowerTutor and other tools available on the market.
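To give a feel for the "self-constructive" idea, here is a baseline sketch of fitting a linear power model purely from battery-reported discharge readings, with no external meter. Note that the regressors and numbers are invented for illustration, and that I use plain ordinary least squares here, whereas Sesame's actual method improves on this baseline with PCA to cope with noisy, correlated readings.

```python
import numpy as np

# Hypothetical training data: rows are (cpu_util, screen_on, net_kbps);
# y is the power (W) inferred from the battery interface's discharge rate.
X = np.array([[0.1, 0, 0], [0.5, 1, 50], [0.9, 1, 200], [0.3, 0, 10], [0.7, 1, 100]])
y = np.array([0.3, 1.4, 2.6, 0.55, 1.9])

# Fit P ~ w.x + b by ordinary least squares (intercept added as a column).
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_power(cpu, screen, net):
    """Predicted system power (W) for a given usage vector."""
    return coef @ np.array([cpu, screen, net, 1.0])
```

Once fitted, the model acts as a virtual power meter: it can be queried at a much higher rate than the battery interface itself reports.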
... and that's all from DC!