Some of the problems I saw in this year's Yahoo campus recruitment
Section C:
2. Insert Interval (LeetCode)
@喵星人与汪星人 covered this one here: http://huntfor.iteye.com/blog/2085095
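The post above explains it in detail; below is a minimal sketch of the standard one-pass merge in C++. The Interval struct here is a stand-in for LeetCode's actual class definition.

```cpp
#include <vector>
#include <algorithm>
using namespace std;

struct Interval {
    int start, end;
    Interval(int s, int e) : start(s), end(e) {}
};

// Insert newInterval into a sorted, non-overlapping list, merging where needed.
vector<Interval> insertInterval(const vector<Interval>& intervals, Interval newInterval) {
    vector<Interval> result;
    size_t i = 0, n = intervals.size();
    // 1. Copy intervals that end before the new one starts.
    while (i < n && intervals[i].end < newInterval.start)
        result.push_back(intervals[i++]);
    // 2. Merge every interval that overlaps the new one.
    while (i < n && intervals[i].start <= newInterval.end) {
        newInterval.start = min(newInterval.start, intervals[i].start);
        newInterval.end   = max(newInterval.end,   intervals[i].end);
        ++i;
    }
    result.push_back(newInterval);
    // 3. Copy the rest unchanged.
    while (i < n)
        result.push_back(intervals[i++]);
    return result;
}
```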
3. Interleaving String (LeetCode)
Again from @喵星人与汪星人: http://huntfor.iteye.com/blog/2086539, which quotes a very concise solution from a Facebook engineer.
Here is the GeeksforGeeks write-up: http://www.geeksforgeeks.org/check-whether-a-given-string-is-an-interleaving-of-two-other-given-strings-set-2/
And this one walks through the reasoning step by step, quite illuminating: http://www.cnblogs.com/lichen782/p/leetcode_interleaving_string.html
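For reference, a minimal sketch of the usual 2D dynamic-programming check, the same idea the posts above discuss (not their exact code):

```cpp
#include <string>
#include <vector>
using namespace std;

// dp[i][j]: can s3[0..i+j) be formed by interleaving s1[0..i) and s2[0..j)?
bool isInterleave(const string& s1, const string& s2, const string& s3) {
    int n = s1.size(), m = s2.size();
    if (n + m != (int)s3.size()) return false;
    vector<vector<bool>> dp(n + 1, vector<bool>(m + 1, false));
    dp[0][0] = true;
    for (int i = 0; i <= n; ++i) {
        for (int j = 0; j <= m; ++j) {
            // Take the next char of s3 from s1 ...
            if (i > 0 && dp[i - 1][j] && s1[i - 1] == s3[i + j - 1]) dp[i][j] = true;
            // ... or from s2.
            if (j > 0 && dp[i][j - 1] && s2[j - 1] == s3[i + j - 1]) dp[i][j] = true;
        }
    }
    return dp[n][m];
}
```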
Section B:
11. void *pStr; myStruct myArray[10]; pStr = myArray; the question asks how pStr should be incremented.
I don't remember the exact answer choices; here is a post for brushing up on pointers: http://m.blog.csdn.net/blog/tomlingyu/4013376
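The crux is that pointer arithmetic on void* is not allowed by the standard, because the compiler does not know the pointee size; you have to cast first. A minimal sketch, assuming a made-up myStruct definition:

```cpp
#include <cstdio>

struct myStruct { int a; double b; };  // hypothetical definition, not given in the question

int main() {
    myStruct myArray[10];
    void* pStr = myArray;
    // Arithmetic directly on void* is ill-formed; cast to the element type first.
    pStr = (myStruct*)pStr + 1;                 // advances by sizeof(myStruct) bytes
    // Equivalent byte-level form:
    // pStr = (char*)pStr + sizeof(myStruct);
    printf("element size: %zu bytes\n", sizeof(myStruct));
    return 0;
}
```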
12. Conversion between arbitrary number bases
Essentially this piece of code: http://stackoverflow.com/questions/8889733/converting-number-into-different-notations
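The idea is repeated division by the target base, collecting remainders as digits. A minimal sketch of that approach (my own, not the exact code from the Stack Overflow answer):

```cpp
#include <string>
#include <algorithm>

// Convert a non-negative value to its representation in the given base (2..36).
std::string toBase(unsigned long long value, int base) {
    const char* digits = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    if (value == 0) return "0";
    std::string out;
    while (value > 0) {
        out.push_back(digits[value % base]);  // least-significant digit first
        value /= base;
    }
    std::reverse(out.begin(), out.end());     // put most-significant digit first
    return out;
}

// e.g. toBase(255, 16) == "FF", toBase(255, 2) == "11111111"
```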
Section D:
[8 points] True or False? If true, explain why in at most two sentences. If false, explain why or give a brief counterexample in at most two sentences.
- (True or False?) The error of a hypothesis measured over its training set provides a pessimistically biased estimate of the true error of the hypothesis.
Solutions:
False. The training error is optimistically biased, since it is usually smaller than the true error.
- (True or False?) If you are given m data points, and use half for training and half for testing, the difference between training error and test error decreases as m increases.
Solutions:
True. As we get more and more data, the training error increases and the test error decreases, and both converge to the true error.
- (True or False?) Overfitting is more likely when the set of training data is small.
Solutions:
True. With a small training set, it is easier to find a hypothesis that fits the training data exactly, i.e., to overfit.
- (True or False?) Overfitting is more likely when the hypothesis space is small.
Solutions:
False. We can see this from the bias-variance trade-off: a small hypothesis space gives higher bias and lower variance, so it is less likely that we can find a hypothesis that fits the data very well, i.e., overfits.
Turns out these are exam questions from a CMU course about ten years ago, probably their ML program. ⊙﹏⊙b How embarrassing!
3. A derivation problem on maximum likelihood estimation, very similar to the example on Wikipedia: http://zh.wikipedia.org/zh-cn/%E6%9C%80%E5%A4%A7%E4%BC%BC%E7%84%B6%E4%BC%B0%E8%AE%A1
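If memory serves, the example on that Wikipedia page is the coin-toss (Bernoulli) one; a rough sketch of that derivation, assuming n independent tosses with k heads and head probability p:

```latex
% Sketch of the Bernoulli (coin-toss) MLE derivation (requires amsmath).
\begin{align*}
L(p) &= \binom{n}{k}\, p^{k} (1-p)^{n-k} \\
\log L(p) &= \text{const} + k \log p + (n-k)\log(1-p) \\
\frac{d}{dp}\log L(p) &= \frac{k}{p} - \frac{n-k}{1-p} = 0
\quad\Longrightarrow\quad \hat{p} = \frac{k}{n}
\end{align*}
```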