Awesome Deep Learning Resources (https://github.com/guillaume-chevalier/Awesome-Deep-Learning-Resources) !Awesome (https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg) (https://github.com/sindresorhus/awesome)

This is a rough list of my favorite deep learning resources. It has been useful to me for learning how to do deep learning, and I use it for revisiting topics or as a reference.

I (Guillaume Chevalier (https://github.com/guillaume-chevalier)) built this list and have carefully gone through all of the content listed here.

Contents

- Trends (#trends)
- Online classes (#online-classes)
- Books (#books)
- Posts and Articles (#posts-and-articles)
- Practical resources (#practical-resources)
  - Libraries and Implementations (#librairies-and-implementations)
  - Some Datasets (#some-datasets)
- Other Math Theory (#other-math-theory)
  - Gradient Descent Algorithms and Optimization (#gradient-descent-algorithms-and-optimization)
  - Complex Numbers & Digital Signal Processing (#complex-numbers-and-digital-signal-processing)
- Papers (#papers)
  - Recurrent Neural Networks (#recurrent-neural-networks)
  - Convolutional Neural Networks (#convolutional-neural-networks)
  - Attention Mechanisms (#attention-mechanisms)
  - Other (#other)
- YouTube and Videos (#youtube)
- Misc. Hubs and Links (#misc-hubs-and-links)
- License (#license)

Trends

Here are the all-time Google Trends (https://www.google.ca/trends/explore?date=all&q=machine%20learning,deep%20learning,data%20science,computer%20programming), from 2004 up to now, September 2017:

[Google Trends chart comparing "machine learning", "deep learning", "data science" and "computer programming"]

You might also want to look at Andrej Karpathy's new post (https://medium.com/@karpathy/a-peek-at-trends-in-machine-learning-ab8a1085a106) about trends in Machine Learning research.

I believe that deep learning is the key to making computers think more like humans, and that it has a lot of potential. Some hard automation tasks that were impossible to achieve earlier with classical algorithms can now be solved easily with it.

Moore's Law, about exponential progress rates in computing hardware, now affects GPUs more than CPUs because of physical limits on how tiny an atomic-scale transistor can be. We are shifting toward parallel architectures (read more (https://www.quora.com/Does-Moores-law-apply-to-GPUs-Or-only-CPUs)). Deep learning exploits such parallel architectures under the hood by using GPUs. On top of that, deep learning algorithms may use Quantum Computing and apply to machine-brain interfaces in the future.

I find that the key to intelligence and cognition is a very interesting subject to explore and is not yet well understood. These technologies are promising.

Online Classes

- DL&RNN Course (https://www.dl-rnn-course.neuraxio.com/start?utm_source=github_awesome) - I created this richly dense course on Deep Learning and Recurrent Neural Networks.
- Machine Learning by Andrew Ng on Coursera (https://www.coursera.org/learn/machine-learning) - Renowned entry-level online class with a certificate (https://www.coursera.org/account/accomplishments/verify/DXPXHYFNGKG3). Taught by: Andrew Ng, Associate Professor, Stanford University; Chief Scientist, Baidu; Chairman and Co-founder, Coursera.
- Deep Learning Specialization by Andrew Ng on Coursera (https://www.coursera.org/specializations/deep-learning) - New series of 5 Deep Learning courses by Andrew Ng, now with Python rather than Matlab/Octave, which leads to a specialization certificate (https://www.coursera.org/account/accomplishments/specialization/U7VNC3ZD9YD8).
- Deep Learning by Google (https://www.udacity.com/course/deep-learning--ud730) - Good intermediate to advanced-level course covering high-level deep learning concepts. I found it helps to get creative once the basics are acquired.
- Machine Learning for Trading by Georgia Tech (https://www.udacity.com/course/machine-learning-for-trading--ud501) - Interesting class for acquiring basic knowledge of machine learning applied to trading and some AI and finance concepts. I especially liked the section on Q-Learning.
- Neural networks class by Hugo Larochelle, Université de Sherbrooke (https://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghAJ6NAPrNmUBH) - Interesting class about neural networks, available online for free by Hugo Larochelle; I have only watched a few of those videos so far.
- GLO-4030/7030 Apprentissage par réseaux de neurones profonds (https://ulaval-damas.github.io/glo4030/) - A class given by Philippe Giguère, Professor at Université Laval. I especially found its rare visualization of the multi-head attention mechanism awesome; it can be seen on slide 28 of week 13's class (http://www2.ift.ulaval.ca/~pgiguere/cours/DeepLearning/09-Attention.pdf).
- Deep Learning & Recurrent Neural Networks (DL&RNN) (https://www.neuraxio.com/en/time-series-solution) - The most richly dense, accelerated course on the topic of Deep Learning & Recurrent Neural Networks (scroll to the end of the page).

Books

- Clean Code (https://www.amazon.ca/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882) - Get back to the basics, you fool! Learn how to write Clean Code for your career. This is by far the best book I've read, even if this list is related to Deep Learning.
- Clean Coder (https://www.amazon.ca/Clean-Coder-Conduct-Professional-Programmers/dp/0137081073) - Learn how to be professional as a coder and how to interact with your manager. This is important for any coding career.
- How to Create a Mind (https://www.amazon.com/How-Create-Mind-Thought-Revealed/dp/B009VSFXZ4) - The audio version is nice to listen to while commuting. This book is motivating about reverse-engineering the mind and thinking about how to code AI.
- Neural Networks and Deep Learning (http://neuralnetworksanddeeplearning.com/index.html) - This book covers many of the core concepts behind neural networks and deep learning.
- Deep Learning - An MIT Press book (http://www.deeplearningbook.org/) - I am only halfway through the book, but it contains satisfying math content on how to think about actual deep learning.
- Some other books I have read (https://books.google.ca/books?hl=en&as_coll=4&num=100&uid=103409002069648430166&source=gbs_slider_cls_metadata_4_mylibrary_title) - Some books listed here are less related to deep learning but are still somehow relevant to this list.

Posts and Articles

- Predictions made by Ray Kurzweil (https://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzweil) - List of mid- to long-term futuristic predictions made by Ray Kurzweil.
- The Unreasonable Effectiveness of Recurrent Neural Networks (http://karpathy.github.io/2015/05/21/rnn-effectiveness/) - MUST READ post by Andrej Karpathy - this is what motivated me to learn RNNs; it demonstrates what they can achieve in the most basic form of NLP.
- Neural Networks, Manifolds, and Topology (http://colah.github.io/posts/2014-03-NN-Manifolds-Topology/) - Fresh look at how neurons map information.
- Understanding LSTM Networks (http://colah.github.io/posts/2015-08-Understanding-LSTMs/) - Explains the LSTM cells' inner workings; plus, it has interesting links in its conclusion.
- Attention and Augmented Recurrent Neural Networks (http://distill.pub/2016/augmented-rnns/) - Interesting for its visual animations; it is a nice intro to attention mechanisms, as an example.
- Recommending music on Spotify with deep learning (http://benanne.github.io/2014/08/05/spotify-cnns.html) - Awesome for doing clustering on audio - post by an intern at Spotify.
- Announcing SyntaxNet: The World’s Most Accurate Parser Goes Open Source (https://research.googleblog.com/2016/05/announcing-syntaxnet-worlds-most.html) - Parsey McParseface's birth, a neural syntax tree parser.
- Improving Inception and Image Classification in TensorFlow (https://research.googleblog.com/2016/08/improving-inception-and-image.html) - Very interesting CNN architecture (e.g., the inception-style convolutional layers are promising and efficient in terms of reducing the number of parameters).
- WaveNet: A Generative Model for Raw Audio (https://deepmind.com/blog/wavenet-generative-model-raw-audio/) - Realistic talking machines: perfect voice generation.
- François Chollet's Twitter (https://twitter.com/fchollet) - Author of Keras - has interesting Twitter posts and innovative ideas.
- Neuralink and the Brain’s Magical Future (http://waitbutwhy.com/2017/04/neuralink.html) - Thought-provoking article about the future of the brain and brain-computer interfaces.
- Migrating to Git LFS for Developing Deep Learning Applications with Large Files (http://vooban.com/en/tips-articles-geek-stuff/migrating-to-git-lfs-for-developing-deep-learning-applications-with-large-files/) - Easily manage huge files in your private Git projects.
- The future of deep learning (https://blog.keras.io/the-future-of-deep-learning.html) - François Chollet's thoughts on the future of deep learning.
- Discover structure behind data with decision trees (http://vooban.com/en/tips-articles-geek-stuff/discover-structure-behind-data-with-decision-trees/) - Grow decision trees and visualize them, and infer the hidden logic behind data.
- Hyperopt tutorial for Optimizing Neural Networks’ Hyperparameters (http://vooban.com/en/tips-articles-geek-stuff/hyperopt-tutorial-for-optimizing-neural-networks-hyperparameters/) - Learn to explore hyperparameter spaces automatically rather than by hand (a minimal usage sketch follows at the end of this list).
- Estimating an Optimal Learning Rate For a Deep Neural Network (https://medium.com/@surmenok/estimating-optimal-learning-rate-for-a-deep-neural-network-ce32f2556ce0) - Clever trick to estimate an optimal learning rate prior to any full training.
 - The Annotated Transformer (http://nlp.seas.harvard.edu/2018/04/03/attention.html) - Good for understanding the "Attention Is All You Need" (AIAYN) paper.
 - The Illustrated Transformer (http://jalammar.github.io/illustrated-transformer/) - Also good for understanding the "Attention Is All You Need" (AIAYN) paper.
 - Improving Language Understanding with Unsupervised Learning (https://blog.openai.com/language-unsupervised/) - SOTA across many NLP tasks from unsupervised pretraining on a huge corpus.
 - NLP's ImageNet moment has arrived (https://thegradient.pub/nlp-imagenet/) - All hail NLP's ImageNet moment.
 - The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) (https://jalammar.github.io/illustrated-bert/) - Understand the different approaches used for NLP's ImageNet moment.
 - Uncle Bob's Principles Of OOD (http://butunclebob.com/ArticleS.UncleBob.PrinciplesOfOod) - Not only are the SOLID principles needed for writing clean code, but the lesser-known REP, CCP, CRP, ADP, SDP and SAP principles are also very important for developing large software that must be bundled into different, separate packages.
 - Why do 87% of data science projects never make it into production? (https://venturebeat.com/2019/07/19/why-do-87-of-data-science-projects-never-make-it-into-production/) - Data is not to be overlooked, and communication between teams and data scientists is important to integrate solutions properly.
 - The real reason most ML projects fail (https://towardsdatascience.com/what-is-the-main-reason-most-ml-projects-fail-515d409a161f) - Focus on clear business objectives, avoid pivoting between algorithms unless you have really clean code, and be able to know when what you coded is "good enough".
 - SOLID Machine Learning (https://www.umaneo.com/post/the-solid-principles-applied-to-machine-learning) - The SOLID principles applied to Machine Learning.

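As a companion to the Hyperopt tutorial listed above, here is a minimal sketch of what automated hyperparameter search with the hyperopt package looks like. It is my own toy illustration, not the tutorial's code: the objective is a simple quadratic standing in for a real model's validation loss, and the search space below is an assumption for demonstration only.

    # Minimal sketch of hyperparameter search with hyperopt (assumed installed).
    from hyperopt import fmin, tpe, hp, STATUS_OK

    def objective(params):
        # In a real setup: build and train a model with these hyperparameters,
        # then return its validation loss here.
        lr = params["learning_rate"]
        return {"loss": (lr - 0.01) ** 2, "status": STATUS_OK}

    space = {
        # Log-uniform prior: sensible for learning rates spanning orders of magnitude.
        "learning_rate": hp.loguniform("learning_rate", -10, 0),
    }

    best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50)
    print(best)  # best found hyperparameters, here a learning rate near 0.01
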
Practical Resources

Libraries and Implementations

- Neuraxle, a framework for machine learning pipelines (https://github.com/Neuraxio/Neuraxle) - The best framework for structuring and deploying your machine learning projects, which is also compatible with most frameworks (e.g., Scikit-Learn, TensorFlow, PyTorch, Keras, and so forth).
- TensorFlow's GitHub repository (https://github.com/tensorflow/tensorflow) - Most known deep learning framework, both high-level and low-level while staying flexible.
- skflow (https://github.com/tensorflow/skflow) - TensorFlow wrapper à la scikit-learn.
- Keras (https://keras.io/) - Keras is another interesting deep learning framework like TensorFlow; it is mostly high-level.
- carpedm20's repositories (https://github.com/carpedm20) - Many interesting neural network architectures are implemented by the Korean guy Taehoon Kim, a.k.a. carpedm20.
- carpedm20/NTM-tensorflow (https://github.com/carpedm20/NTM-tensorflow) - Neural Turing Machine TensorFlow implementation.
- Deep learning for lazybones (http://oduerr.github.io/blog/2016/04/06/Deep-Learning_for_lazybones) - Transfer learning tutorial in TensorFlow for vision, using high-level embeddings of a pretrained CNN, AlexNet 2012.
- LSTM for Human Activity Recognition (HAR) (https://github.com/guillaume-chevalier/LSTM-Human-Activity-Recognition) - Tutorial of mine on using LSTMs on time series for classification (a minimal sketch of such a classifier follows at the end of this list).
- Deep stacked residual bidirectional LSTMs for HAR (https://github.com/guillaume-chevalier/HAR-stacked-residual-bidir-LSTMs) - Improvements on the previous project.
- Sequence to Sequence (seq2seq) Recurrent Neural Network (RNN) for Time Series Prediction (https://github.com/guillaume-chevalier/seq2seq-signal-prediction) - Tutorial of mine on how to predict temporal sequences of numbers - which may be multichannel.
- Hyperopt for a Keras CNN on CIFAR-100 (https://github.com/guillaume-chevalier/Hyperopt-Keras-CNN-CIFAR-100) - Auto (meta) optimizing a neural net (and its architecture) on the CIFAR-100 dataset.
- ML / DL repositories I starred (https://github.com/guillaume-chevalier?direction=desc&page=1&q=machine+OR+deep+OR+learning+OR+rnn+OR+lstm+OR+cnn&sort=stars&tab=stars&utf8=%E2%9C%93) - GitHub is full of nice code samples & projects.
- Smoothly Blend Image Patches (https://github.com/guillaume-chevalier/Smoothly-Blend-Image-Patches) - Smooth patch merger for semantic segmentation with a U-Net (https://vooban.com/en/tips-articles-geek-stuff/satellite-image-segmentation-workflow-with-u-net/).
- Self Governing Neural Networks (SGNN): the Projection Layer (https://github.com/guillaume-chevalier/SGNN-Self-Governing-Neural-Networks-Projection-Layer) - With this, you can use words in your deep learning models without training nor loading embeddings.
- Neuraxle (https://github.com/Neuraxio/Neuraxle) - Neuraxle is a Machine Learning (ML) library for building neat pipelines, providing the right abstractions to ease research, development, and deployment of your ML applications.
- Clean Machine Learning, a Coding Kata (https://github.com/Neuraxio/Kata-Clean-Machine-Learning-From-Dirty-Code) - Learn the good design patterns to use for doing Machine Learning the good way, by practicing.

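To make the LSTM-for-classification idea from the HAR tutorials above concrete, here is a minimal sketch of a time-series classifier. It assumes TensorFlow 2.x (tf.keras) and uses toy random data with HAR-like shapes; it is not the tutorials' actual code.

    # Minimal LSTM classifier for fixed-length multichannel time series.
    # Assumptions: TensorFlow 2.x; toy random data with HAR-like shapes.
    import numpy as np
    from tensorflow.keras import layers, models

    n_steps, n_channels, n_classes = 128, 9, 6  # window length, sensor channels, activities

    model = models.Sequential([
        layers.Input(shape=(n_steps, n_channels)),
        layers.LSTM(32),                                # summarize the whole window
        layers.Dense(n_classes, activation="softmax"),  # one probability per class
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Random stand-ins for sensor windows and activity labels.
    X = np.random.randn(256, n_steps, n_channels).astype("float32")
    y = np.random.randint(0, n_classes, size=256)
    model.fit(X, y, epochs=1, batch_size=32, verbose=0)
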
Some Datasets

Those are resources I have found that seem interesting for developing models on.

- UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets.html) - TONS of datasets for ML.
- Cornell Movie--Dialogs Corpus (http://www.cs.cornell.edu/~cristian/Cornell_Movie-Dialogs_Corpus.html) - This could be used for a chatbot.
- SQuAD The Stanford Question Answering Dataset (https://rajpurkar.github.io/SQuAD-explorer/) - Question answering dataset that can be explored online, and a list of models performing well on that dataset.
- LibriSpeech ASR corpus (http://www.openslr.org/12/) - Huge free English speech dataset with balanced genders and speakers, which seems to be of high quality.
- Awesome Public Datasets (https://github.com/caesar0301/awesome-public-datasets) - An awesome list of public datasets.
- SentEval: An Evaluation Toolkit for Universal Sentence Representations (https://arxiv.org/abs/1803.05449) - A Python framework to benchmark your sentence representations on many datasets (NLP tasks).
- ParlAI: A Dialog Research Software Platform (https://arxiv.org/abs/1705.06476) - Another Python framework to benchmark your sentence representations on many datasets (NLP tasks).

Other Math Theory

Gradient Descent Algorithms & Optimization Theory

- Neural Networks and Deep Learning, ch.2 (http://neuralnetworksanddeeplearning.com/chap2.html) - Overview of how the backpropagation algorithm works.
- Neural Networks and Deep Learning, ch.4 (http://neuralnetworksanddeeplearning.com/chap4.html) - A visual proof that neural nets can compute any function.
- Yes you should understand backprop (https://medium.com/@karpathy/yes-you-should-understand-backprop-e2f06eab496b#.mr5wq61fb) - Exposing backprop's caveats and the importance of knowing them while training models.
- Artificial Neural Networks: Mathematics of Backpropagation (http://briandolhansky.com/blog/2013/9/27/artificial-neural-networks-backpropagation-part-4) - Picturing backprop, mathematically.
- Deep Learning Lecture 12: Recurrent Neural Nets and LSTMs (https://www.youtube.com/watch?v=56TYLaQN4N8) - Unfolding of RNN graphs is explained properly, and potential problems with gradient descent algorithms are exposed.
- Gradient descent algorithms in a saddle point (http://sebastianruder.com/content/images/2016/09/saddle_point_evaluation_optimizers.gif) - Visualize how different optimizers interact with a saddle point.
- Gradient descent algorithms in an almost flat landscape (https://devblogs.nvidia.com/wp-content/uploads/2015/12/NKsFHJb.gif) - Visualize how different optimizers interact with an almost flat landscape.
- Gradient Descent (https://www.youtube.com/watch?v=F6GSRDoB-Cg) - Okay, I already listed Andrew Ng's Coursera class above, but this video especially is quite pertinent as an introduction and defines the gradient descent algorithm (a minimal sketch of the update rule follows this list).
- Gradient Descent: Intuition (https://www.youtube.com/watch?v=YovTqTY-PYY) - What follows from the previous video: now add intuition.
- Gradient Descent in Practice 2: Learning Rate (https://www.youtube.com/watch?v=gX6fZHgfrow) - How to adjust the learning rate of a neural network.
- The Problem of Overfitting (https://www.youtube.com/watch?v=u73PU6Qwl1I) - A good explanation of overfitting and how to address that problem.
- Diagnosing Bias vs Variance (https://www.youtube.com/watch?v=ewogYw5oCAI) - Understanding bias and variance in the predictions of a neural net and how to address those problems.
- Self-Normalizing Neural Networks (https://arxiv.org/pdf/1706.02515.pdf) - Appearance of the incredible SELU activation function.
- Learning to learn by gradient descent by gradient descent (https://arxiv.org/pdf/1606.04474.pdf) - RNN as an optimizer: introducing the L2L optimizer, a meta-neural network.

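As a quick illustration of the update rule defined in the gradient descent videos above, here is a minimal sketch on a one-dimensional quadratic; it is my own toy example, not taken from the course.

    # Plain gradient descent: repeatedly step against the gradient of the cost.
    def gradient_descent(grad, theta0, lr=0.1, n_steps=100):
        theta = theta0
        for _ in range(n_steps):
            theta = theta - lr * grad(theta)  # theta <- theta - learning_rate * dJ/dtheta
        return theta

    # Cost J(theta) = (theta - 3)^2 has gradient 2 * (theta - 3) and its minimum at theta = 3.
    print(gradient_descent(lambda theta: 2.0 * (theta - 3.0), theta0=0.0))  # converges near 3.0
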
Complex Numbers & Digital Signal Processing

Okay, signal processing might not be directly related to deep learning, but studying it is interesting for building more intuition when developing neural architectures based on signals.

- Window Functions (https://en.wikipedia.org/wiki/Window_function) - Wikipedia page that lists some of the known window functions - note that the Hann-Poisson window (https://en.wikipedia.org/wiki/Window_function#Hann%E2%80%93Poisson_window) is especially interesting for greedy hill-climbing algorithms (like gradient descent, for example). A minimal windowing sketch follows at the end of this list.
- MathBox, Tools for Thought Graphical Algebra and Fourier Analysis (https://acko.net/files/gltalks/toolsforthought/) - New look on Fourier analysis.
- How to Fold a Julia Fractal (http://acko.net/blog/how-to-fold-a-julia-fractal/) - Animations dealing with complex numbers and wave equations.
- Animate Your Way to Glory, Math and Physics in Motion (http://acko.net/blog/animate-your-way-to-glory/) - Convergence methods in physics engines, applied to interaction design.
- Animate Your Way to Glory - Part II, Math and Physics in Motion (http://acko.net/blog/animate-your-way-to-glory-pt2/) - Nice animations for rotation and rotation interpolation with Quaternions, a mathematical object for handling 3D rotations.
- Filtering signal, plotting the STFT and the Laplace transform (https://github.com/guillaume-chevalier/filtering-stft-and-laplace-transform) - Simple Python demo on signal processing.

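As a small illustration of why window functions matter in practice, here is a minimal NumPy sketch that applies a Hann window before an FFT. This is my own toy example: NumPy ships np.hanning, whereas the Hann-Poisson window mentioned above would need to be coded from its formula.

    # Windowing a signal before the FFT to reduce spectral leakage (NumPy assumed).
    import numpy as np

    fs = 1000                              # sampling rate, in Hz
    t = np.arange(0, 1.0, 1.0 / fs)        # one second of samples
    x = np.sin(2 * np.pi * 50 * t)         # a 50 Hz sine wave

    window = np.hanning(len(x))            # taper the edges of the frame
    spectrum = np.abs(np.fft.rfft(x * window))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

    print(freqs[np.argmax(spectrum)])      # peak frequency, near 50 Hz
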
Papers

Recurrent Neural Networks

- Deep Learning in Neural Networks: An Overview (https://arxiv.org/pdf/1404.7828v4.pdf) - You_Again's summary/overview of deep learning, mostly about RNNs.
- Bidirectional Recurrent Neural Networks (http://www.di.ufpe.br/~fnj/RNA/bibliografia/BRNN.pdf) - Better classifications with RNNs, with bidirectional scanning on the time axis.
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation (https://arxiv.org/pdf/1406.1078v3.pdf) - Two networks combined into one in a seq2seq (sequence to sequence) Encoder-Decoder architecture: an RNN Encoder-Decoder with 1000 hidden units and the Adadelta optimizer. (A minimal encoder-decoder sketch follows at the end of this list.)
- Sequence to Sequence Learning with Neural Networks (http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf) - 4 stacked LSTM cells of 1000 hidden size with reversed input sentences, and with beam search, on the WMT’14 English to French dataset.
- Exploring the Limits of Language Modeling (https://arxiv.org/pdf/1602.02410.pdf) - Nice recursive models using word-level LSTMs on top of a character-level CNN, using an overkill amount of GPU power.
- Neural Machine Translation and Sequence-to-sequence Models: A Tutorial (https://arxiv.org/pdf/1703.01619.pdf) - Interesting overview of the subject of NMT; I mostly read part 8 about RNNs with attention as a refresher.
- Exploring the Depths of Recurrent Neural Networks with Stochastic Residual Learning (https://cs224d.stanford.edu/reports/PradhanLongpre.pdf) - Basically, residual connections can be better than stacked RNNs in the presented case of sentiment analysis.
- Pixel Recurrent Neural Networks (https://arxiv.org/pdf/1601.06759.pdf) - Nice for photoshop-like "content aware fill" to fill missing patches in images.
- Adaptive Computation Time for Recurrent Neural Networks (https://arxiv.org/pdf/1603.08983v4.pdf) - Let RNNs decide how long they compute. I would love to see how well it would combine with Neural Turing Machines. Interesting interactive visualizations on the subject can be found here (http://distill.pub/2016/augmented-rnns/).

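For a concrete picture of the RNN Encoder-Decoder (seq2seq) architecture discussed in the papers above, here is a minimal tf.keras sketch on toy sine-wave data. It is a simplified illustration under that assumption, not the papers' actual models: single small LSTM layers, no beam search, and no teacher forcing.

    # Minimal RNN encoder-decoder (seq2seq) sketch, assuming TensorFlow 2.x.
    import numpy as np
    from tensorflow.keras import layers, Model

    seq_len, n_features, latent = 10, 1, 64

    # Encoder: read the input sequence and keep only its final states.
    enc_in = layers.Input(shape=(seq_len, n_features))
    _, state_h, state_c = layers.LSTM(latent, return_state=True)(enc_in)

    # Decoder: generate the output sequence, initialized with the encoder's states.
    # (Real setups teacher-force the decoder inputs; zeros are used below for brevity.)
    dec_in = layers.Input(shape=(seq_len, n_features))
    dec_seq = layers.LSTM(latent, return_sequences=True)(dec_in, initial_state=[state_h, state_c])
    dec_out = layers.TimeDistributed(layers.Dense(n_features))(dec_seq)

    model = Model([enc_in, dec_in], dec_out)
    model.compile(optimizer="adam", loss="mse")

    # Toy task: given 10 points of a sine wave, predict the next 10 points.
    t = np.linspace(0, 8 * np.pi, 2000)
    x = np.sin(t)
    X = np.array([x[i:i + seq_len] for i in range(1000)])[..., None]
    Y = np.array([x[i + seq_len:i + 2 * seq_len] for i in range(1000)])[..., None]
    model.fit([X, np.zeros_like(Y)], Y, epochs=2, batch_size=32, verbose=0)
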
[38;2;255;187;0m[4mConvolutional Neural Networks[0m
|
||
|
||
[38;5;12m- [39m[38;5;14m[1mWhat is the Best Multi-Stage Architecture for Object Recognition?[0m[38;5;12m (http://yann.lecun.com/exdb/publis/pdf/jarrett-iccv-09.pdf) - Awesome for the use of "local contrast normalization".[39m
|
||
[38;5;12m- [39m[38;5;14m[1mImageNet Classification with Deep Convolutional Neural Networks[0m[38;5;12m (http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf) - AlexNet, 2012 ILSVRC, breakthrough of the ReLU activation function.[39m
|
||
[38;5;12m- [39m[38;5;14m[1mVisualizing and Understanding Convolutional Networks[0m[38;5;12m (https://arxiv.org/pdf/1311.2901v3.pdf) - For the "deconvnet layer".[39m
|
||
[38;5;12m- [39m[38;5;14m[1mFast and Accurate Deep Network Learning by Exponential Linear Units[0m[38;5;12m (https://arxiv.org/pdf/1511.07289v1.pdf) - ELU activation function for CIFAR vision tasks.[39m
|
||
[38;5;12m-[39m[38;5;12m [39m[38;5;14m[1mVery[0m[38;5;14m[1m [0m[38;5;14m[1mDeep[0m[38;5;14m[1m [0m[38;5;14m[1mConvolutional[0m[38;5;14m[1m [0m[38;5;14m[1mNetworks[0m[38;5;14m[1m [0m[38;5;14m[1mfor[0m[38;5;14m[1m [0m[38;5;14m[1mLarge-Scale[0m[38;5;14m[1m [0m[38;5;14m[1mImage[0m[38;5;14m[1m [0m[38;5;14m[1mRecognition[0m[38;5;12m [39m[38;5;12m(https://arxiv.org/pdf/1409.1556v6.pdf)[39m[38;5;12m [39m[38;5;12m-[39m[38;5;12m [39m[38;5;12mInteresting[39m[38;5;12m [39m[38;5;12midea[39m[38;5;12m [39m[38;5;12mof[39m[38;5;12m [39m[38;5;12mstacking[39m[38;5;12m [39m[38;5;12mmultiple[39m[38;5;12m [39m[38;5;12m3x3[39m[38;5;12m [39m[38;5;12mconv+ReLU[39m[38;5;12m [39m[38;5;12mbefore[39m[38;5;12m [39m[38;5;12mpooling[39m[38;5;12m [39m[38;5;12mfor[39m[38;5;12m [39m[38;5;12ma[39m[38;5;12m [39m[38;5;12mbigger[39m[38;5;12m [39m[38;5;12mfilter[39m[38;5;12m [39m[38;5;12msize[39m[38;5;12m [39m[38;5;12mwith[39m[38;5;12m [39m[38;5;12mjust[39m[38;5;12m [39m[38;5;12ma[39m[38;5;12m [39m[38;5;12mfew[39m[38;5;12m [39m[38;5;12mparameters.[39m[38;5;12m [39m[38;5;12mThere[39m[38;5;12m [39m[38;5;12mis[39m[38;5;12m [39m[38;5;12malso[39m[38;5;12m [39m[38;5;12ma[39m
|
||
[38;5;12mnice[39m[38;5;12m [39m[38;5;12mtable[39m[38;5;12m [39m[38;5;12mfor[39m[38;5;12m [39m[38;5;12m"ConvNet[39m[38;5;12m [39m[38;5;12mConfiguration".[39m
|
||
[38;5;12m-[39m[38;5;12m [39m[38;5;14m[1mGoing[0m[38;5;14m[1m [0m[38;5;14m[1mDeeper[0m[38;5;14m[1m [0m[38;5;14m[1mwith[0m[38;5;14m[1m [0m[38;5;14m[1mConvolutions[0m[38;5;12m [39m[38;5;12m(http://www.cv-foundation.org/openaccess/content_cvpr_2015/papers/Szegedy_Going_Deeper_With_2015_CVPR_paper.pdf)[39m[38;5;12m [39m[38;5;12m-[39m[38;5;12m [39m[38;5;12mGoogLeNet:[39m[38;5;12m [39m[38;5;12mAppearance[39m[38;5;12m [39m[38;5;12mof[39m[38;5;12m [39m[38;5;12m"Inception"[39m[38;5;12m [39m[38;5;12mlayers/modules,[39m[38;5;12m [39m[38;5;12mthe[39m[38;5;12m [39m[38;5;12midea[39m[38;5;12m [39m[38;5;12mis[39m[38;5;12m [39m[38;5;12mof[39m[38;5;12m [39m[38;5;12mparallelizing[39m[38;5;12m [39m[38;5;12mconv[39m[38;5;12m [39m[38;5;12mlayers[39m[38;5;12m [39m[38;5;12minto[39m
|
||
[38;5;12mmany[39m[38;5;12m [39m[38;5;12mmini-conv[39m[38;5;12m [39m[38;5;12mof[39m[38;5;12m [39m[38;5;12mdifferent[39m[38;5;12m [39m[38;5;12msize[39m[38;5;12m [39m[38;5;12mwith[39m[38;5;12m [39m[38;5;12m"same"[39m[38;5;12m [39m[38;5;12mpadding,[39m[38;5;12m [39m[38;5;12mconcatenated[39m[38;5;12m [39m[38;5;12mon[39m[38;5;12m [39m[38;5;12mdepth.[39m
|
||
[38;5;12m- [39m[38;5;14m[1mHighway Networks[0m[38;5;12m (https://arxiv.org/pdf/1505.00387v2.pdf) - Highway networks: residual connections.[39m
|
||
[38;5;12m-[39m[38;5;12m [39m[38;5;14m[1mBatch[0m[38;5;14m[1m [0m[38;5;14m[1mNormalization:[0m[38;5;14m[1m [0m[38;5;14m[1mAccelerating[0m[38;5;14m[1m [0m[38;5;14m[1mDeep[0m[38;5;14m[1m [0m[38;5;14m[1mNetwork[0m[38;5;14m[1m [0m[38;5;14m[1mTraining[0m[38;5;14m[1m [0m[38;5;14m[1mby[0m[38;5;14m[1m [0m[38;5;14m[1mReducing[0m[38;5;14m[1m [0m[38;5;14m[1mInternal[0m[38;5;14m[1m [0m[38;5;14m[1mCovariate[0m[38;5;14m[1m [0m[38;5;14m[1mShift[0m[38;5;12m [39m[38;5;12m(https://arxiv.org/pdf/1502.03167v3.pdf)[39m[38;5;12m [39m[38;5;12m-[39m[38;5;12m [39m[38;5;12mBatch[39m[38;5;12m [39m[38;5;12mnormalization[39m[38;5;12m [39m[38;5;12m(BN):[39m[38;5;12m [39m[38;5;12mto[39m[38;5;12m [39m[38;5;12mnormalize[39m[38;5;12m [39m[38;5;12ma[39m[38;5;12m [39m[38;5;12mlayer's[39m[38;5;12m [39m[38;5;12moutput[39m[38;5;12m [39m[38;5;12mby[39m[38;5;12m [39m[38;5;12malso[39m[38;5;12m [39m[38;5;12msumming[39m[38;5;12m [39m[38;5;12mover[39m[38;5;12m [39m[38;5;12mthe[39m[38;5;12m [39m[38;5;12mentire[39m[38;5;12m [39m[38;5;12mbatch,[39m[38;5;12m [39m[38;5;12mand[39m[38;5;12m [39m[38;5;12mthen[39m[38;5;12m [39m
|
||
[38;5;12mperforming[39m[38;5;12m [39m[38;5;12ma[39m[38;5;12m [39m[38;5;12mlinear[39m[38;5;12m [39m[38;5;12mrescaling[39m[38;5;12m [39m[38;5;12mand[39m[38;5;12m [39m[38;5;12mshifting[39m[38;5;12m [39m[38;5;12mof[39m[38;5;12m [39m[38;5;12ma[39m[38;5;12m [39m[38;5;12mcertain[39m[38;5;12m [39m[38;5;12mtrainable[39m[38;5;12m [39m[38;5;12mamount.[39m
|
||
[38;5;12m- [39m[38;5;14m[1mU-Net: Convolutional Networks for Biomedical Image Segmentation[0m[38;5;12m (https://arxiv.org/pdf/1505.04597.pdf) - The U-Net is an encoder-decoder CNN that also has skip-connections, good for image segmentation at a per-pixel level.[39m
|
||
[38;5;12m-[39m[38;5;12m [39m[38;5;14m[1mDeep[0m[38;5;14m[1m [0m[38;5;14m[1mResidual[0m[38;5;14m[1m [0m[38;5;14m[1mLearning[0m[38;5;14m[1m [0m[38;5;14m[1mfor[0m[38;5;14m[1m [0m[38;5;14m[1mImage[0m[38;5;14m[1m [0m[38;5;14m[1mRecognition[0m[38;5;12m [39m[38;5;12m(https://arxiv.org/pdf/1512.03385v1.pdf)[39m[38;5;12m [39m[38;5;12m-[39m[38;5;12m [39m[38;5;12mVery[39m[38;5;12m [39m[38;5;12mdeep[39m[38;5;12m [39m[38;5;12mresidual[39m[38;5;12m [39m[38;5;12mlayers[39m[38;5;12m [39m[38;5;12mwith[39m[38;5;12m [39m[38;5;12mbatch[39m[38;5;12m [39m[38;5;12mnormalization[39m[38;5;12m [39m[38;5;12mlayers[39m[38;5;12m [39m[38;5;12m-[39m[38;5;12m [39m[38;5;12ma.k.a.[39m[38;5;12m [39m[38;5;12m"how[39m[38;5;12m [39m[38;5;12mto[39m[38;5;12m [39m[38;5;12moverfit[39m[38;5;12m [39m[38;5;12many[39m[38;5;12m [39m[38;5;12mvision[39m[38;5;12m [39m[38;5;12mdataset[39m[38;5;12m [39m[38;5;12mwith[39m[38;5;12m [39m[38;5;12mtoo[39m[38;5;12m [39m[38;5;12mmany[39m[38;5;12m [39m[38;5;12mlayers[39m[38;5;12m [39m[38;5;12mand[39m[38;5;12m [39m[38;5;12mmake[39m[38;5;12m [39m[38;5;12many[39m[38;5;12m [39m[38;5;12mvision[39m[38;5;12m [39m[38;5;12mmodel[39m[38;5;12m [39m[38;5;12mwork[39m[38;5;12m [39m
|
||
[38;5;12mproperly[39m[38;5;12m [39m[38;5;12mat[39m[38;5;12m [39m[38;5;12mrecognition[39m[38;5;12m [39m[38;5;12mgiven[39m[38;5;12m [39m[38;5;12menough[39m[38;5;12m [39m[38;5;12mdata".[39m
|
||
[38;5;12m- [39m[38;5;14m[1mInception-v4, Inception-ResNet and the Impact of Residual Connections on Learning[0m[38;5;12m (https://arxiv.org/pdf/1602.07261v2.pdf) - For improving GoogLeNet with residual connections.[39m
|
||
[38;5;12m- [39m[38;5;14m[1mWaveNet: a Generative Model for Raw Audio[0m[38;5;12m (https://arxiv.org/pdf/1609.03499v2.pdf) - Epic raw voice/music generation with new architectures based on dilated causal convolutions to capture more audio length.[39m
|
||
- Learning a Probabilistic Latent Space of Object Shapes via 3D Generative-Adversarial Modeling (https://arxiv.org/pdf/1610.07584v2.pdf) - 3D-GANs for 3D model generation and fun 3D furniture arithmetic from embeddings (think word2vec-style word arithmetic, but with 3D furniture representations).
- Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour (https://research.fb.com/publications/ImageNet1kIn1h/) - Incredibly fast distributed training of a CNN.
- Densely Connected Convolutional Networks (https://arxiv.org/pdf/1608.06993.pdf) - Best Paper Award at CVPR 2017; this new architecture, named DenseNet, improves on state-of-the-art performance on the CIFAR-10, CIFAR-100 and SVHN datasets.
- The One Hundred Layers Tiramisu: Fully Convolutional DenseNets for Semantic Segmentation (https://arxiv.org/pdf/1611.09326.pdf) - Merges the ideas of the U-Net and the DenseNet; this new network is especially good for huge image segmentation datasets.
- Prototypical Networks for Few-shot Learning (https://arxiv.org/pdf/1703.05175.pdf) - Use a distance metric in the loss to determine which class an object belongs to, from only a few examples.
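
To make the residual and batch-normalization ideas referenced above concrete, here is a minimal sketch of a residual block, assuming PyTorch is installed; the class name ResidualBlock and the shapes are my own illustration, not code from the papers:

```python
# Minimal residual block sketch (assumes PyTorch; names and shapes are illustrative,
# not taken from the papers above).
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Two 3x3 convolutions, each followed by batch normalization,
        # i.e. a normalization plus a trainable linear rescaling and shifting.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        identity = x                            # the shortcut (skip) connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)        # add the input back in, then activate


if __name__ == "__main__":
    block = ResidualBlock(channels=16)
    x = torch.randn(2, 16, 32, 32)              # (batch, channels, height, width)
    print(block(x).shape)                       # torch.Size([2, 16, 32, 32])
```

The shortcut is what lets gradients flow through very deep stacks; DenseNet pushes a related idea further by concatenating earlier feature maps instead of adding them.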


Attention Mechanisms

- Neural Machine Translation by Jointly Learning to Align and Translate (https://arxiv.org/pdf/1409.0473.pdf) - Attention mechanism for LSTMs! Mostly, the figures and formulas and their explanations proved useful to me. I gave a talk on that paper here (https://www.youtube.com/watch?v=QuvRWevJMZ4).
- Neural Turing Machines (https://arxiv.org/pdf/1410.5401v2.pdf) - Outstanding for letting a neural network learn an algorithm with seemingly good generalization over long time dependencies, e.g. on the sequence recall problem.
- Show, Attend and Tell: Neural Image Caption Generation with Visual Attention (https://arxiv.org/pdf/1502.03044.pdf) - An LSTM's attention mechanism over CNN feature maps does wonders.
- Teaching Machines to Read and Comprehend (https://arxiv.org/pdf/1506.03340v3.pdf) - A very interesting and creative work about textual question answering; quite a breakthrough, with plenty left to build on.
- Effective Approaches to Attention-based Neural Machine Translation (https://arxiv.org/pdf/1508.04025.pdf) - Exploring different approaches to attention mechanisms.
- Matching Networks for One Shot Learning (https://arxiv.org/pdf/1606.04080.pdf) - Interesting way of doing one-shot learning in low-data regimes by using an attention mechanism and a query to compare an image to other images for classification.
- Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation (https://arxiv.org/pdf/1609.08144.pdf) - In 2016: stacked residual LSTMs with attention mechanisms on encoder/decoder are the best for NMT (Neural Machine Translation).
- Hybrid computing using a neural network with dynamic external memory (http://www.nature.com/articles/nature20101.epdf?author_access_token=ImTXBI8aWbYxYQ51Plys8NRgN0jAjWel9jnR3ZoTv0MggmpDmwljGswxVdeocYSurJ3hxupzWuRNeGvvXnoO8o4jTJcnAyhGuZzXJ1GEaD-Z7E6X_a9R-xqJ9TfJWBqz) - Improvements on differentiable memory based on NTMs: now it is the Differentiable Neural Computer (DNC).
- Massive Exploration of Neural Machine Translation Architectures (https://arxiv.org/pdf/1703.03906.pdf) - Yields intuition about the boundaries of what works for doing NMT within a framed seq2seq problem formulation.
- Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions (https://arxiv.org/pdf/1712.05884.pdf) - A WaveNet (https://arxiv.org/pdf/1609.03499v2.pdf) used as a vocoder can be conditioned on Mel spectrograms generated by the Tacotron 2 LSTM neural network with attention, to generate neat audio from text.
- Attention Is All You Need (https://arxiv.org/abs/1706.03762) (AIAYN) - Introduces multi-head self-attention neural networks with positional encoding to do sentence-level NLP without any RNNs or CNNs - this paper is a must-read (also see this explanation (http://nlp.seas.harvard.edu/2018/04/03/attention.html) and this visualization (http://jalammar.github.io/illustrated-transformer/) of the paper, plus the scaled dot-product attention sketch right after this list).
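
Since several of the papers above revolve around the same core operation, here is a minimal numpy sketch of scaled dot-product attention as described in AIAYN; the learned Q/K/V projections, multiple heads and masking are deliberately left out, and the function names are mine:

```python
# Minimal numpy sketch of scaled dot-product attention (single head, no masking,
# no learned Q/K/V projection matrices - just the core formula:
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V).
import numpy as np


def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys) similarity scores
    weights = softmax(scores, axis=-1)   # each query's weights over the keys sum to 1
    return weights @ V, weights          # weighted sum of the values


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 8))   # 3 query positions, dimension 8
    K = rng.normal(size=(5, 8))   # 5 key positions
    V = rng.normal(size=(5, 8))   # one value vector per key
    out, w = scaled_dot_product_attention(Q, K, V)
    print(out.shape, w.shape)     # (3, 8) (3, 5)
```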


Other

- ProjectionNet: Learning Efficient On-Device Deep Networks Using Neural Projections (https://arxiv.org/abs/1708.00630) - Replace word embeddings by word projections in your deep neural networks, which requires neither a pre-extracted dictionary nor storing embedding matrices.
- Self-Governing Neural Networks for On-Device Short Text Classification (http://aclweb.org/anthology/D18-1105) - This paper is the sequel to the ProjectionNet just above. The SGNN builds on the ProjectionNet, and the optimizations are detailed more in depth (also see my attempt to reproduce the paper in code (https://github.com/guillaume-chevalier/SGNN-Self-Governing-Neural-Networks-Projection-Layer) and the talk's recording (https://vimeo.com/305197775)).
- Matching Networks for One Shot Learning (https://arxiv.org/abs/1606.04080) - Classify a new example from a list of other examples (without definitive categories), with little data per classification task but lots of data across many similar classification tasks - it seems better than siamese networks. To sum up: with Matching Networks, you can optimize directly for a cosine similarity between examples (much like a self-attention product would) which is passed directly to the softmax (a small numpy sketch of this follows the list). I guess that Matching Networks could probably be combined with negative-sampling softmax training as in word2vec's CBOW or Skip-gram, without having to do any context embedding lookups.
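
As a rough illustration of the Matching Networks idea just described (cosine similarities over a labelled support set, passed to a softmax), here is a minimal numpy sketch; the learned embedding networks from the paper are omitted and inputs are assumed to already be embedding vectors, so treat it as a toy, not the paper's method:

```python
# Toy sketch of the Matching Networks idea: classify a query by a softmax over its
# cosine similarities to a small labelled support set (learned embeddings omitted).
import numpy as np


def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()


def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)


def matching_predict(query, support_embeddings, support_labels, n_classes):
    sims = np.array([cosine_similarity(query, s) for s in support_embeddings])
    attention = softmax(sims)                     # attention weights over support examples
    one_hot = np.eye(n_classes)[support_labels]   # (n_support, n_classes)
    return attention @ one_hot                    # class probabilities for the query


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    support = rng.normal(size=(6, 16))            # 6 support examples, 16-d embeddings
    labels = np.array([0, 0, 1, 1, 2, 2])
    query = support[3] + 0.05 * rng.normal(size=16)  # close to a class-1 example
    print(matching_predict(query, support, labels, n_classes=3))
```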


YouTube and Videos

- Attention Mechanisms in Recurrent Neural Networks (RNNs) - IGGG (https://www.youtube.com/watch?v=QuvRWevJMZ4) - A talk for a reading group on attention mechanisms (Paper: Neural Machine Translation by Jointly Learning to Align and Translate).
- Tensor Calculus and the Calculus of Moving Surfaces (https://www.youtube.com/playlist?list=PLlXfTHzgMRULkodlIEqfgTS-H1AY_bNtq) - Properly generalizes how tensors work; just watching a few videos already helps a lot in grasping the concepts.
- Deep Learning & Machine Learning (Advanced topics) (https://www.youtube.com/playlist?list=PLlp-GWNOd6m4C_-9HxuHg2_ZeI2Yzwwqt) - A list of videos about deep learning that I found interesting or useful; a mix of a bit of everything.
- Signal Processing Playlist (https://www.youtube.com/playlist?list=PLlp-GWNOd6m6gSz0wIcpvl4ixSlS-HEmr) - A YouTube playlist I composed about the DFT/FFT, the STFT and the Laplace transform - I was mad about my software engineering bachelor's not including signal processing classes (except a bit in the quantum physics class). A tiny FFT/STFT sketch follows this list.
- Computer Science (https://www.youtube.com/playlist?list=PLlp-GWNOd6m7vLOsW20xAJ81-65C-Ys6k) - Yet another YouTube playlist I composed, this time about various CS topics.
- Siraj's Channel (https://www.youtube.com/channel/UCWN3xxRkmTPmbKwht9FuE5A/videos?view=0&sort=p&flow=grid) - Siraj has entertaining, fast-paced video tutorials about deep learning.
- Two Minute Papers' Channel (https://www.youtube.com/user/keeroyz/videos?sort=p&view=0&flow=grid) - Interesting and shallow overview of some research papers, for example about WaveNet or Neural Style Transfer.
- Geoffrey Hinton interview (https://www.coursera.org/learn/neural-networks-deep-learning/lecture/dcm5r/geoffrey-hinton-interview) - Andrew Ng interviews Geoffrey Hinton, who talks about his research and breakthroughs, and gives advice for students.
- Growing Neat Software Architecture from Jupyter Notebooks (https://www.youtube.com/watch?v=K4QN27IKr0g) - A primer on how to structure your Machine Learning projects when using Jupyter Notebooks.
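
Because the signal processing playlist above is about the DFT/FFT and the STFT, here is a tiny numpy sketch of the short-time Fourier transform idea (window the signal, FFT each window, stack the magnitudes); the frame size and hop length are arbitrary choices of mine:

```python
# Tiny numpy sketch of the DFT/STFT ideas covered in the playlist above:
# window a signal, take the FFT of each window, and stack the magnitude spectra
# into a spectrogram (parameter choices are illustrative only).
import numpy as np


def stft_magnitude(signal, frame_size=256, hop=128):
    window = np.hanning(frame_size)
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size] * window
        frames.append(np.abs(np.fft.rfft(frame)))  # magnitude spectrum of this frame
    return np.array(frames)                        # shape: (n_frames, frame_size // 2 + 1)


if __name__ == "__main__":
    sr = 8000                                      # sample rate in Hz
    t = np.arange(0, 1.0, 1.0 / sr)
    sig = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
    print(stft_magnitude(sig).shape)               # (61, 129) with the defaults above
```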


Misc. Hubs & Links

- Hacker News (https://news.ycombinator.com/news) - Maybe how I discovered ML - Interesting trends appear on that site way before they get to be a big deal.
- DataTau (http://www.datatau.com/) - This is a hub similar to Hacker News, but specific to data science.
- Naver (http://www.naver.com/) - This is a Korean search engine - best used with Google Translate, ironically. Surprisingly, sometimes deep learning search results and comprehensible advanced math content show up more easily there than on Google search.
- Arxiv Sanity Preserver (http://www.arxiv-sanity.com/) - arXiv browser with TF/IDF features.
- Awesome Neuraxle (https://github.com/Neuraxio/Awesome-Neuraxle) - An awesome list for Neuraxle, an ML framework for coding clean production-level ML pipelines.


License

CC0 (http://mirrors.creativecommons.org/presskit/buttons/88x31/svg/cc-zero.svg) (https://creativecommons.org/publicdomain/zero/1.0/)

To the extent possible under law, Guillaume Chevalier (https://github.com/guillaume-chevalier) has waived all copyright and related or neighboring rights to this work.