Do you want to get updates? Please join the Data Science & Machine Learning Newsletter LinkedIn group.


  • Python module to perform under-sampling and over-sampling with various techniques
    • “imbalanced-learn is a Python package offering a number of re-sampling techniques commonly used in datasets showing strong between-class imbalance. It is compatible with scikit-learn and is part of scikit-learn-contrib projects.”
  • On Machine Learning and Programming Languages
    • “While machine learning does not yet have a dedicated language, several efforts are effectively creating hidden new languages underneath a Python API (like TensorFlow) while others are reusing Python as a modelling language (like PyTorch). We’d like to ask – are new ML-tailored languages required, and if so, why? More importantly, what might the ideal ML language of the future look like?”
  • Deep Neuroevolution: Genetic Algorithms Are a Competitive Alternative for Training Deep Neural Networks for Reinforcement Learning
    • “We evolve the weights of a DNN with a simple, gradient-free, population-based genetic algorithm (GA) and it performs well on hard deep RL problems, including Atari and humanoid locomotion. The Deep GA successfully evolves networks with over four million free parameters, the largest neural networks ever evolved with a traditional evolutionary algorithm.”
  • MUSE — A library for Multilingual Unsupervised or Supervised word Embeddings
    • “MUSE is a Python library for multilingual word embeddings, whose goal is to provide the community with: state-of-the-art multilingual word embeddings based on fastText, and large-scale high-quality bilingual dictionaries for training and evaluation”
  • Deep Learning Achievements Over the Past Year
    • “Around Christmas time, our team decided to take stock of the recent achievements in deep learning over the past year (and a bit longer). We translated the article by a data scientist, Ed Tyantov, to tell you about the most significant developments that can affect our future.”
  • How to Improve my ML Algorithm?
    • “You have worked for weeks on building your machine learning system and the performance is not something you are satisfied with. You think of multiple ways to improve your algorithm’s performance, viz. collect more data, add more hidden units, add more layers, change the network architecture, change the basic algorithm, etc. But which one of these will give the best improvement on your system? You can either try them all, investing a lot of time to find out what works for you, or you can use the following tips from Ng’s experience.”
  • TensorFlow for Short-Term Stocks Prediction
    • “In machine learning, a convolutional neural network (CNN, or ConvNet) is a class of neural networks that has successfully been applied to image recognition and analysis. In this project I’ve approached this class of models trying to apply it to stock market prediction, combining stock prices with sentiment analysis.”
  • 5 Tricks When AB Testing Is Off The Table
    • “Here’s the good news: just because we can’t always AB test a major experience doesn’t mean we have to fly blind when it matters most. A range of econometric methods can illuminate the causal relationships at play, providing actionable insights for the path forward.”
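The simplest of the re-sampling techniques imbalanced-learn offers, random over-sampling, can be sketched in plain NumPy. This is an illustrative toy implementation, not the library's code; the function name and data below are made up for the example:

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Naively over-sample minority classes by duplicating rows at
    random until every class matches the majority class count.
    (Illustrative sketch; imbalanced-learn's RandomOverSampler
    covers this and many smarter techniques such as SMOTE.)"""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [X], [y]
    for cls, count in zip(classes, counts):
        if count < target:
            idx = np.flatnonzero(y == cls)
            extra = rng.choice(idx, size=target - count, replace=True)
            X_parts.append(X[extra])
            y_parts.append(y[extra])
    return np.concatenate(X_parts), np.concatenate(y_parts)

# Toy 90/10 imbalanced dataset: 9 samples of class 0, 1 of class 1.
X = np.arange(20).reshape(10, 2)
y = np.array([0] * 9 + [1])
X_res, y_res = random_oversample(X, y)
print(np.bincount(y_res))  # both classes now have 9 samples
```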
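The gradient-free, population-based GA from the Deep Neuroevolution abstract boils down to truncation selection plus Gaussian mutation of a flat weight vector. A minimal sketch on a toy objective follows; the function name, hyperparameters, and objective are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def simple_ga(fitness, dim, pop_size=50, elite=10, sigma=0.1,
              generations=100, seed=0):
    """Each generation: keep the top `elite` parameter vectors,
    then refill the population with mutated (Gaussian-noise)
    copies of them. No gradients are ever computed."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(size=(pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[-elite:]]  # best `elite` survive
        children = parents[rng.integers(elite, size=pop_size - elite)]
        children = children + sigma * rng.normal(size=children.shape)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(w) for w in pop])
    return pop[scores.argmax()]

# Toy stand-in for an RL return: maximise -||w - 1||^2.
best = simple_ga(lambda w: -np.sum((w - 1.0) ** 2), dim=5)
```

In the paper this same loop runs over millions of DNN weights with an environment rollout as the fitness function; the toy objective here only shows the mechanics.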
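Ng's tips on improving an ML system largely come down to diagnosing whether bias (underfitting) or variance (overfitting) dominates before choosing a fix. A crude decision-rule sketch, where the function name and error thresholds are illustrative assumptions only:

```python
def suggest_next_step(train_err, dev_err, target_err=0.02):
    """Pick an improvement direction from train/dev error alone:
    a large gap to the target on the training set signals bias,
    a large train-to-dev gap signals variance."""
    if train_err - target_err > dev_err - train_err:
        # High bias: the model can't even fit the training set well.
        return "bigger network / train longer / new architecture"
    if dev_err - train_err > 0.0:
        # High variance: model fits train data but not unseen data.
        return "more data / regularization / new architecture"
    return "close to target; tune incrementally"

print(suggest_next_step(train_err=0.15, dev_err=0.16))  # high bias
print(suggest_next_step(train_err=0.01, dev_err=0.11))  # high variance
```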