You may have noticed that postings on this blog have become scarcer, even though they were already scarce before. This doesn't mean I have lost interest in writing posts here. Rather, my sabbatical period (within another sabbatical period) has brought me many new findings and interesting, exciting things to explore. In this post, I write about some of the things you may find yourself curious about too.
Wolfram the Knowledge Engine and Knowledge Discovery
Some have speculated that Wolfram will be a Google killer, although the two differ in purpose (as the Wolfram development team has clearly stated on their site). Wolfram is a knowledge engine with its own data repositories, capable of answering questions across very diverse domains. Unlike search engines, which collect data from various sources and rank it by relevance using algorithms such as PageRank, a knowledge engine stores data that are "factual" and representative of the knowledge itself. For each question, it computes the answer based on a model that represents the question. Two fields pertinent to this application are machine learning and data mining. A key issue in developing an application like Wolfram is that it must be smart enough to interpret the question and provide relevant, factual, and thus less subjective answers.
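To make the contrast concrete, here is a toy power-iteration PageRank over a tiny link graph, which is the kind of relevance ranking a search engine applies to collected pages (as opposed to computing an answer from a knowledge model). The graph, damping factor, and iteration count are illustrative assumptions, not data from any real engine.

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: links maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            # A dangling page spreads its rank over all pages.
            targets = outs if outs else pages
            share = rank[p] / len(targets)
            for q in targets:
                new[q] += damping * share
        rank = new
    return rank

# Hypothetical three-page web: "c" is linked from both "a" and "b".
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # prints: c
```

A search engine would sort results by such scores; a knowledge engine like Wolfram instead dispatches the question to a domain model and computes the answer directly.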
Wolfram itself is an implementation of techniques from Knowledge Discovery (KD), a field also related to data mining and machine learning. Wolfram's tasks pertain to collecting and segregating knowledge from a knowledge base. The engine essentially forwards inputs to the most appropriate or representative knowledge model, then returns output based on that model's processing. What if we only have data and want to discover the knowledge behind it? This is where the phase of model inference or creation takes place. The model for the knowledge can vary with the interpretation of the data; hence it is always interesting to derive models from supplied data.
I played with Wolfram and found it a nice tool for answering mathematical problems ranging from simple arithmetic to advanced calculus, as well as providing statistical, scientific, and historical data, and even socio-factual data such as quotes from films. Given that I actually wrote this post several months before it was finally published, Wolfram has (hopefully) also improved in response to the criticisms it received after its initial launch.
Like other critics, I spotted the two most challenging problems Wolfram faced in its alpha stage (I started experimenting with Wolfram Alpha in May this year). The first was processing time: sometimes the engine took more than five seconds to return an answer. I speculate this was caused either by congestion (queues) in the compute-grid infrastructure or by the nature of the question, which simply needed time to compute. The second was inconsistent or unexpected answers and unanswered questions, which in my opinion were caused by unsupported models or by different assumptions the engine made about the question. Despite these shortcomings, I still believe Wolfram will grow big in the near future, provided it implements more robust and active data mining techniques and extends its current models.
Cloud Computing
Cloud computing is a pervasive internet buzzword. Several definitions of cloud computing float around the internet, and there seems to be no single consensus about what is and is not cloud computing. Academia, represented by Ian Foster, defines cloud computing as a large-scale distributed computing paradigm driven by economies of scale, in which a pool of abstracted, virtualized, dynamically scalable, managed computing power, storage, platforms, and services is delivered on demand to external customers over the internet. Someone with an extensive management background may find this definition too cryptic or too sugar-coated. Thus, we can easily find more straightforward, plain-vanilla definitions from enterprises. Intel, for example, defines cloud computing as a computing paradigm where services and data reside in shared resources in scalable data centers, and those services and data are accessible by any authenticated device over the internet.
Despite the differing definitions, academia and enterprises share the same perspective on certain characteristics of cloud computing: resources are shared, distributed, and provided on demand (which implies that they are available on a contract basis). Cloud computing offers a new paradigm for managing, storing, and utilizing resources in networks. I did some research on cloud computing and managed to publish a paper about the future of today's cloud computing. I may also write follow-up posts about cloud computing here. Given the current trends in industry and academia, I am positive that cloud computing will lead the evolution towards future distributed systems.
Ubiquitous Networks (Systems)
Ubiquitous networks, in simple terms, refer to networks that can connect anyone and anything, anywhere. This implies a situation in which ICT becomes part of the social culture: used by people in daily life, successfully integrated with it, and easily accessible.
Speaking of current ubiquitous trends, the era of IP-based sensor networks is about to arrive. Ubiquitous networks will enable access to, management of, and control over various devices using the pervasive TCP/IP(v6) network stack. In the past ten years, we have seen research efforts that tried to integrate various networked devices using custom network stacks rather than TCP/IP. However, as IPv6 becomes more widely implemented and adopted, it won't be long until we can control various home instruments from a browser on our office desktop.
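As a small sketch of what browser-based control over IPv6 looks like at the addressing level: an IPv6 device is reached through a bracketed address literal in the URL, as standardized for URIs. The device address, port, and path below are hypothetical (2001:db8::/32 is the IPv6 documentation prefix); the snippet just shows how such a URL decomposes.

```python
from urllib.parse import urlsplit

# Hypothetical home-sensor endpoint reachable over IPv6; the brackets
# around the address are required URL syntax for IPv6 literals.
url = "http://[2001:db8::10]:8080/sensors/temperature"
parts = urlsplit(url)

print(parts.hostname)  # IPv6 address without brackets: 2001:db8::10
print(parts.port)      # 8080
print(parts.path)      # /sensors/temperature
```

A browser (or any HTTP client) on an office desktop would open a TCP connection to that IPv6 address and issue an ordinary HTTP request, which is exactly why a shared TCP/IP stack makes heterogeneous devices reachable without custom protocols.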
I feel lucky to see Korea developing its ubiquitous systems, from basic ones like smart homes and smart offices to larger systems like mass transportation and even ubiquitous government. To me, Korea is a fine example of the quick adoption and development of technologies that reach people in their daily lives and give them tangible benefits.