Sunday, July 4, 2010

ICML 2010 Highlights

636: Sparse Gaussian Process Regression via L1 Penalization
Feng Yan (Purdue University); Yuan Qi (Purdue University)
They introduced a way to do sparse GPs on large datasets by adding an L1 penalty to the influence of each data point. Irrelevant points are effectively removed by solving a convex optimization, which avoids the local-optima problems of standard sparse GP methods.
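
To show roughly how the sparsity mechanism works, here is a minimal sketch. It assumes a plain squared-loss kernel regression with an L1 penalty on the per-point coefficients, which is a simplification of the paper's GP formulation: coefficients driven to zero remove the corresponding data points, and the whole fit stays convex.

```python
# Hedged sketch (not the paper's exact objective): fit f(x) = sum_i alpha_i * k(x, x_i)
# with an L1 penalty on alpha, so that irrelevant data points get alpha_i = 0
# and drop out of the predictor.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # toy 1-D inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

K = rbf_kernel(X, X, gamma=1.0)                  # kernel matrix used as "features"
model = Lasso(alpha=0.01, fit_intercept=False)   # convex: squared loss + L1
model.fit(K, y)

active = np.flatnonzero(model.coef_)             # data points that survive
print(f"kept {active.size} of {len(X)} points")

# Predictions only need the retained points.
X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
print(rbf_kernel(X_new, X[active], gamma=1.0) @ model.coef_[active])
```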

I liked all the papers in the application track:
901: Web-Scale Bayesian Click-Through Rate Prediction for Sponsored Search Advertising in Microsoft's Bing Search Engine
Thore Graepel (Microsoft Research); Joaquin Quiñonero Candela (Microsoft Research); Thomas Borchert (Microsoft Research); Ralf Herbrich (Microsoft Research)

902: Detecting Large-Scale System Problems by Mining Console Logs
Wei Xu (UC Berkeley); Ling Huang (Intel Labs Berkeley); Armando Fox (UC Berkeley); David Patterson (UC Berkeley); Michael I. Jordan (UC Berkeley)
I liked this since it is somewhat related to my project.

903: The Role of Machine Learning in Business Optimization
Chid Apte (IBM T. J. Watson Research Center)
IBM is using machine learning to make collecting back taxes in New York State more efficient (which some people found scary).

374: Local Minima Embedding
Minyoung Kim (CMU); Fernando De la Torre (CMU)
The idea is to embed a high-dimensional objective function into a lower-dimensional space that can be visualized while preserving its local optima. It's a really good idea, but it is not yet powerful enough to help with the hard problems we would want to solve with it (i.e., visualizing local optima in high-dimensional neural network optimization).
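
To make the "preserve the local optima" goal concrete, here is a toy illustration rather than the paper's method: it builds a 50-dimensional function with two known minima and evaluates it along the 1-D line through them, a slice in which both minima stay visible. LME aims to guarantee that kind of preservation in general, without hand-picking directions.

```python
# Toy illustration only (not the LME algorithm): a high-dimensional function
# with two known minima, viewed along the 1-D segment connecting them.
import numpy as np

d = 50                                        # ambient dimension
rng = np.random.default_rng(1)
a, b = rng.standard_normal(d), rng.standard_normal(d)

def f(x):
    # product of squared distances: f >= 0, with minima (f = 0) at both a and b
    return np.sum((x - a) ** 2) * np.sum((x - b) ** 2)

for t in np.linspace(-0.25, 1.25, 7):
    x = (1 - t) * a + t * b                   # point on the line through a and b
    print(f"t = {t:+.2f}   f = {f(x):12.2f}")
# The values dip to 0 at t = 0 and t = 1, so this particular 1-D view keeps
# both minima; a good embedding should do the same without knowing them upfront.
```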

495: Hilbert Space Embeddings of Hidden Markov Models
Le Song (CMU); Byron Boots (Carnegie Mellon University); Sajid Siddiqi (Google); Geoffrey Gordon (Carnegie Mellon University); Alex Smola (Yahoo! Research)

551: Distance Dependent Chinese Restaurant Processes
David Blei (Princeton University); Peter Frazier (Cornell)
