Postdoc Opening | Professor Hao Ni at University College London Is Recruiting Postdocs in Machine Learning / Computer Vision
Struggling to find the right job? Out of the loop on the latest openings?
AI 求职 (AI Job) curates the freshest recruitment news in artificial intelligence, helping you apply ahead of the crowd and get hired a step faster!
University College London
About us
Applications are invited for a full-time Research Fellow position to work with Dr Hao Ni on the Human Computer Interfaces project. The position is funded by the EPSRC as part of the Programme Grant Unparameterised multi-modal data, high order signatures, and the mathematics of data science (DATASIG) and will be part of a multi-university research group (Imperial College London, University College London, and the University of Oxford) with a hub at The Alan Turing Institute. This position will be based at University College London and The Alan Turing Institute, British Library, and the successful applicant will be embedded alongside the software engineering team at The Alan Turing Institute.
Through a collaboration between leading mathematicians and leading domain scientists, the Programme will create a powerful and generic set of mathematical and computational tools for the analysis of complex multimodal data streams, and will establish their effective use in four applied challenges (ACs): Mental Health, Radio Astronomy, Human Machine Interfaces and Computer Vision. Each area will be supported by at least one Research Fellow; other Research Fellows will focus on the underlying mathematics.
The successful applicant will work closely with Dr Hao Ni (Co-I Maths, UCL) and the Project Partner Lianwen Jin from South China University of Technology (SCUT), focusing on core challenges in Human Computer Interfaces. Smart devices should be intelligent interpreters that collect requests, digest information and return valuable responses, eventually transforming how people live and work. The human computer interaction (HCI) experience with these devices will be critical in determining the nature of that transformation. We focus on three kinds of intelligent HCI technologies for such devices: handwriting-based HCI, accelerometer-based HCI and egocentric gesture-based HCI. These three share a common characteristic: all are based on accurately interpreting multidimensional sequential data in real time, with a large (or huge) number of possible outcomes.
Candidates should hold a PhD or equivalent qualification in mathematics or a field related to the programme, or should be finalising their PhD or awaiting their viva. The successful candidate will have research experience in rough path theory and related areas, clear evidence of outstanding promise and originality in research, and a good publication record commensurate with career stage. They will be able to identify, develop and apply concepts, techniques and methods in new contexts; keep accurate records of research results and activity and help with reporting; exercise initiative and judgement in carrying out research tasks; conduct a detailed review of recent literature; and organise their own work independently, prioritising in response to deadlines. They will also demonstrate the willingness and ability to work effectively within a team of researchers and across disciplines, contributing to the programme's aims and objectives by actively engaging with the broader programme team, and will have excellent communication skills, both written and oral.
Please note: Appointment at Grade 7 is dependent on award and confirmation of a PhD (or equivalent). If this is not the case, initial appointment will be at Research Assistant Grade 6B, point 26 (salary £34,976 per annum inclusive of London allowance) with payment at Grade 7 being backdated to the date of final submission of the PhD thesis.
- 41 days holiday (27 days annual leave, 8 bank holidays and 6 closure days)
- Additional 5 days’ annual leave purchase scheme
- Defined benefit career average revalued earnings pension scheme (CARE)
- Cycle to work scheme and season ticket loan
- Immigration loan
- Relocation scheme for certain posts
- On-Site nursery
- On-site gym
- Enhanced maternity, paternity and adoption pay
- Employee assistance programme: Staff Support Service
- Discounted medical insurance
Visit https://www.ucl.ac.uk/work-at-ucl/reward-and-benefits to find out more.
https://www.ucl.ac.uk/work-at-ucl/search-ucl-jobs/details?jobId=823&jobTitle=Research%20Fellow
Internship Referrals
索尼中国研究院|阿里巴巴国际化广告技术团队|OPPO小布助手|百度视觉技术团队|北京脑科学与类脑研究中心合作实验室|Hulu机器学习应用平台团队|粤港澳大湾区数字经济研究院|交叉科技|浪潮集团|上海期智研究院|启元世界|商汤研究院基础视觉组|小红书|未来机器人|微软亚洲研究院机器学习组|香港量子人工智能实验室|联汇科技|百图生科|阿里Lazada广告技术团队|遇见森林|同花顺问财集群|快手社科线策略算法部|百度视觉技术部|蔚来汽车|Sony AI隐私保护机器学习部|腾讯云小微自然语言技术中心|微软研究院科学智能中心|百度研究院|百度知识图谱部|广东智慧教育研究院|非凸科技|北京智源人工智能研究院
University Admissions
上海财经大学语言智能实验室|香港中文大学(深圳)|乔治梅森大学|香港中文大学(深圳)查宏远教授|麦吉尔大学智能自动化实验室|上海交通大学叶南阳老师|南洋理工大学Lu Shijian教授|清华大学周伯文老师|香港科技大学陈浩老师|香港理工大学智能计算实验室|香港浸会大学雪巍老师|香港中文大学岳翔宇老师|北卡州立大学郭志山教授|香港科技大学(广州)王泽宇老师|中弗罗里达大学姚凡老师|浙江大学廖备水教授|达尔豪斯大学计算机系|清华大学弋力老师课题组|香港科技大学统计机器学习实验室|印第安纳大学姜雷教授|北卡州立大学胥栋宽老师|奥克兰大学工业人工智能课题组|清华大学智能计算实验室|佛罗里达州立大学王广老师|香港城市大学陆志聪老师|香港浸会大学杨任驰老师|中佛罗里达大学娄钱老师|圣路易斯华盛顿大学|北京大学分子影像实验室|空军军医大学徐肖攀老师|新加坡国立大学|香港科技大学冯雁教授|KAUST IVUL实验室|香港中文大学(深圳)路广利老师|芝加哥大学徐海峰老师|约翰霍普金斯大学|香港中文大学朱昭颖教授|香港理工大学李菁老师|爱丁堡大学李昌健老师|南方科技大学陈冠华老师|UIUC信息学院汪浩瀚老师
To better understand and serve everyone's needs, we have set up a Job Seekers Community.
By joining the AI Job community, you get access to services such as exposure for your recruitment needs, the latest interview experiences, campus-recruitment preparation guides, master's and PhD admissions information, and exclusive referral channels.
Scan the assistant's WeChat QR code below and pick the position you like.
How to Post a Job
AI Job is a recruitment platform under PaperWeekly focused on artificial intelligence, covering university master's and PhD admissions, postdoc recruitment, corporate campus recruitment, experienced hires, internships, and referrals.
Companies including Baidu, Alibaba, Tencent, and ByteDance have already posted referral positions. You are welcome to subscribe, follow, and post openings. If you would like more exposure for your company and its open positions, please contact our column manager (WeChat: dajun164164).
Copyright Disclaimer: The copyright of contents (including texts, images, videos and audios) posted above belong to the User who shared or the third-party website which the User shared from. If you found your copyright have been infringed, please send a DMCA takedown notice to [email protected]. For more detail of the source, please click on the button "Read Original Post" below. For other communications, please send to [email protected].