Together with the Urbanscope team, we gave a TEDx talk on the topics and results of the project here at Politecnico di Milano. The talk was given by our junior researchers, as we wanted it to be a choral performance rather than the typical one-man show.
The message is that cities are not merely physical and organizational devices: they are informational landscapes, where places are shaped more by streams of data than by traditional physical evidence. We devise tools and analyses for understanding these streams and the phenomena they represent, in order to better understand our cities.
Two layers coexist: a thick and dynamic layer of digital traces – the informational membrane – grows every day on top of the material layer of the territory, the buildings, and the infrastructures. The observation, analysis, and representation of these two layers combined provide valuable insights into how the city is used and lived.
Urbanscope is a research laboratory that experiments with the collection, organization, analysis, and visualization of cross-domain geo-referenced data.
The research team is based at Politecnico di Milano and encompasses researchers with competencies in Computer Engineering, Communication and Information Design, Management Engineering, and Mathematics.
The aim of Urbanscope is to systematically produce compelling views on urban systems to foster understanding and decision making. Views are like new lenses of a macroscope: they are designed to support the recognition of specific patterns, thus enabling new perspectives.
If you enjoyed the show, you can explore our beta application at:
Today I presented our full paper titled “Extracting Emerging Knowledge from Social Media” at the WWW 2017 conference.
The work is based on a rather obvious assumption, namely that knowledge in the world continuously evolves, while ontologies remain largely incomplete with respect to low-frequency data belonging to the so-called long tail.
Socially produced content is an excellent source for discovering emerging knowledge: it is huge, and it immediately reflects the relevant changes in which emerging entities hide.
In the paper we propose a method and a tool for discovering emerging entities by extracting them from social media.
Once instrumented by experts through a very simple initialization, the method is capable of finding emerging entities; it combines syntactic and semantic techniques. The method uses seeds, i.e., prototypes of emerging entities provided by experts, to generate candidates; it then associates each candidate with a feature vector, built from the terms occurring in its social content, ranks the candidates by their distance from the centroid of the seeds, and returns the top candidates as the result.
The method can be continuously or periodically iterated, using the results as new seeds.
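A minimal sketch of the seed-centroid ranking step, assuming simple term-frequency features and cosine similarity (the entity names, posts, and function names below are purely illustrative, not the system's actual implementation):

```python
from collections import Counter
from math import sqrt

def vectorize(texts, vocab):
    """Build a term-frequency vector over a fixed vocabulary."""
    counts = Counter(w for t in texts for w in t.lower().split())
    return [counts[w] for w in vocab]

def cosine(u, v):
    """Cosine similarity between two vectors (0.0 if either is all-zero)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_candidates(seed_docs, candidate_docs, top_k=2):
    """Rank candidate entities by similarity of their term vectors
    to the centroid of the expert-provided seed vectors."""
    all_docs = list(seed_docs.values()) + list(candidate_docs.values())
    vocab = sorted({w for docs in all_docs for t in docs for w in t.lower().split()})
    seed_vecs = [vectorize(d, vocab) for d in seed_docs.values()]
    # Centroid = component-wise mean of the seed vectors.
    centroid = [sum(col) / len(seed_vecs) for col in zip(*seed_vecs)]
    scored = {c: cosine(vectorize(d, vocab), centroid)
              for c, d in candidate_docs.items()}
    return sorted(scored, key=scored.get, reverse=True)[:top_k]
```

In the actual system the feature vectors are richer, and the top-ranked candidates can be fed back as new seeds for the next iteration.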
Social media are becoming increasingly important in the context of live events, such as fairs, exhibits, festivals, and concerts, as they play an essential role in communicating these events to fans, interest groups, and the general population. Such events are geo-localized within a city or territory and are scheduled within a public calendar.
Together with the people in the Fashion in Process group of Politecnico di Milano, we studied the impact on social media of a specific scenario, the Milano Fashion Week (MFW), which is an important event in Milano for the whole fashion business.
We focus on the spreading of social content in space, measuring how the event propagates geographically. We build different clusters of fashion brands, characterize several features of spatial propagation, and correlate them with brand popularity and temporal propagation.
We show that the clusters along the space, time, and popularity dimensions are loosely correlated; therefore, trying to understand the dynamics of the events based on popularity alone would not be appropriate.
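As a hint of how spatial propagation can be quantified, here is a hedged sketch of one possible spread metric: the mean distance of a brand's geotagged posts from their geographic centroid. The metric choice and function names are assumptions for illustration, not the paper's exact definitions.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def spatial_spread(points):
    """Mean distance (km) of geotagged posts from their centroid:
    a simple proxy for how widely an event's content spreads in space."""
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return sum(haversine_km(p, (lat, lon)) for p in points) / len(points)
```

Computing such a spread value per brand makes it straightforward to correlate spatial behaviour with popularity and temporal dynamics.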
Daniele Quercia leads the Social Dynamics group at Bell Labs in Cambridge (UK). He has been named one of Fortune magazine’s 2014 Data All-Stars and spoke about “happy maps” at TED. His research focuses on urban informatics and has received best paper awards from UbiComp 2014 and ICWSM 2015, and an honourable mention from ICWSM 2013. He was a Research Scientist at Yahoo Labs, a Horizon senior researcher at the University of Cambridge, and a Postdoctoral Associate at the Department of Urban Studies and Planning at MIT. He received his PhD from University College London. His thesis was sponsored by Microsoft Research and was nominated for the BCS Best British PhD Dissertation in Computer Science.
His presentation will contrast the corporate smart-city rhetoric about efficiency, predictability, and security with a different perspective on cities, which I find very inspiring and visionary.
“You’ll get to work on time; no queue when you go shopping, and you are safe because of CCTV cameras around you”. Well, all these things make a city acceptable, but they don’t make a city great.
Daniele is launching goodcitylife.org – a global group of like-minded people who are passionate about building technologies whose focus is not necessarily to create a smart city but to give a good life to city dwellers. The future of the city is, first and foremost, about people, and those people are increasingly networked. We will see how a creative use of network-generated data can tackle hitherto unanswered research questions. Can we rethink existing mapping tools [happy-maps]? Is it possible to capture smellscapes of entire cities and celebrate good odors [smelly-maps]? And soundscapes [chatty-maps]?
When people talk about smart cities, the tendency is to think about them in a technology-oriented or sociology-oriented manner.
However, smart cities are the places where we live and work every day now.
Here is a very broad perspective (in Italian) on the experience of big data analysis and smart city instrumentation for the town of Como, in Italy: an account of how phone calls, mobility data, social media, and people counters can contribute to making and evaluating decisions.
Within a completely new line of research, we are exploring the power of modeling for human behaviour analysis, especially within social networks and/or on the occasion of large-scale live events. Participation in challenges within social networks is a very effective instrument for promoting a brand or event, and it is therefore regarded as an excellent marketing tool.
Our first research on this topic was published in November 2016 at the WISE Conference, covering the analysis of user engagement within social network challenges.
In this paper, we take the challenge organizer’s perspective and study how to raise the engagement of players in challenges where they are stimulated to create and evaluate content, thereby indirectly raising awareness of the brand or event itself. Slides are available on SlideShare:
We illustrate a comprehensive model of the actions and strategies that can be exploited to progressively boost social engagement during the evolution of a challenge. The model covers the organizer-driven management of interactions among players and evaluates the effectiveness of each action in light of several other factors (time, repetition, third-party actions, interplay between different social networks, and so on).
We evaluate the model through a set of experiments on a real case, the YourExpo2015 challenge. Overall, our experiments lasted 9 weeks and engaged around 800,000 users on two different social platforms; our quantitative analysis assesses the validity of the model.
Following up on my recent perspective that moves from model-driven development to hidden-model products, together with the Fluxedo team and in collaboration with WebRatio and Eurotech, we launched a new product called EventOmeters.
EventOmeters allows businesses and event organizers to increase the effectiveness of their events, involving participants and providing reliable measures for analyzing the return on investment in trade fairs, music, sports, and in general any gathering of people.
The role of the partners is as follows:
WebRatio is a leading provider of tools, methods and services for the rapid production of customized applications,
Fluxedo is an innovative start-up focusing on mobile app development, social network integration, and semantic social media analytics,
Eurotech integrates data from IoT sensors, made available in real time through cloud technology.
EventOmeters has already been used in the context of the FuoriSalone, within the Milano Design Week. In this setting, the product achieved around 20,000 downloads of the event’s official mobile app and an analysis of more than 110,000 social media posts.
My recent interview on the evolution of social media and its role in modern society is available on YouTube (in Italian only, sorry about that).
While the 3+ minutes of speech necessarily had to be a general overview of the role and recent changes of social media, I wish to summarise here some of its technical aspects.
As I mentioned in the presentation:
social media have changed a lot since their early days: from being consumed on PCs to mobile devices; from general-purpose social networks connecting friends to digital stages where we “sell” our lives to the entire world; from places to share personal information to platforms where we also publish objective information coming from real-world experience.
social media are nowadays a valuable source of information for companies, which look for (and find) their customers through social media marketing and advertising, and for public institutions and researchers, who can leverage large amounts of data to provide benefits to our everyday life.
What I didn’t say is how you can do that. Well, it’s pretty simple.
The ingredients of the recipe:
A lot of users sharing their profiles
A lot of content (photos, statuses, geotags, descriptions) shared by people
(which makes up a VERY big data problem)
Crawlers (or stream-capturing systems) capturing this content, plus storage as needed
MODELS of the context, the problem and the solution
and DATA ANALYSIS TOOLS for studying the data and extracting meaningful information
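Putting the last three ingredients together, here is a minimal, hypothetical sketch of such a pipeline; the keyword filter, function names, and data shapes are purely illustrative, not a real crawler API:

```python
import json
from collections import Counter

def capture(stream, keywords):
    """Crawling step: keep only posts whose text matches the model's
    keywords of interest (the MODEL decides what is relevant)."""
    kws = {k.lower() for k in keywords}
    return [p for p in stream if kws & set(p["text"].lower().split())]

def store(posts, path):
    """Storage step: persist captured posts as JSON lines."""
    with open(path, "w") as f:
        for p in posts:
            f.write(json.dumps(p) + "\n")

def analyze(posts):
    """Analysis step: extract the most frequent terms as a crude signal
    of what people are talking about."""
    counts = Counter(w for p in posts for w in p["text"].lower().split())
    return counts.most_common(3)
```

Real deployments would of course replace each step with production-grade components (streaming APIs, distributed storage, NLP pipelines), but the model-capture-store-analyze shape stays the same.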
To me, the most valuable points are MODELS and ANALYSIS TOOLS. We are doing a lot of experiments on mixing model-driven techniques with semantic analysis, NLP, and social media monitoring. One example of our experiments is the YourExpo2015 Instagram Photo Challenge.
Have a look and participate if you like. More on this coming soon!
To keep updated on my activities you can subscribe to the RSS feed of my blog or follow my twitter account (@MarcoBrambi).
At this point of the year, just before vacation time, it makes sense to me to think about Web Engineering practices at large and to draw some trends and an outlook for the field after this year. As PC chair of ICWE 2012 (International Conference on Web Engineering), I can claim I had a privileged view over the entire event, and I think this was a good vantage point for assessing the field. Furthermore, being directly involved in the organization of the MDWE workshop, I have been directly exposed to the specific aspects of the model-driven field for the Web. I see the following trends in the field:
still limited attention to non-functional aspects in Web application development. Honestly, this doesn’t make any sense to me, and it makes me think that this is one of the reasons why Web Engineering is still seen as a niche sector, both in industry and academia. Admittedly, at least some awareness is starting to appear (e.g., some works on assessing the productivity of model-driven approaches have been discussed), but actual results are still very preliminary. And the Web is ALL ABOUT NON-FUNCTIONAL ISSUES!
mashups are still getting a lot of attention, as demonstrated also by the successful ComposableWeb workshop. However, I think we need to rethink this field a little. My feeling is that traditional mashups are no longer interesting per se. Even very well-known solutions like Yahoo! Pipes have reached a limited audience, and in general mashups have never reached the maturity level that would let them be used for producing enterprise-class professional applications. So, am I claiming that mashups are a complete failure? Not at all! They actually represent an important step that enabled the current trend toward Web API integration, which is probably used in most existing Web sites. I think that in the future the mashup community should look at the broad problem of Web API integration at large.
Finally, content and social analysis (both syntactical/textual and semantic) is getting more and more interest in the community. This is demonstrated by the wide set of well-attended ICWE tutorials that addressed these issues (The Web of Data for E-Commerce, Epidemic Intelligence for the Crowd, SPARQL and Queries over Linked Data, Natural Language Processing for the Web).
If you see some other interesting trends in Web Engineering, feel free to share your thoughts!