RSS Feed Source: Academic Keys

Research Opportunities

Artificial intelligence and remote observation for IoT and Robotics

Work description

The TRIBE LAB (Robotics and IoT for Precision Agriculture and Forestry) has been developing robotic and IoT technology that needs to be integrated with a decision support system to increase its autonomy and impact. In this context, the aim is to explore new projects, publications and concepts that integrate Artificial Intelligence with data provided by satellites, drones, robots and IoT to produce prescription, productivity and action maps that support the farmer's decision-making, as well as to automate the connection between the decision support system and the robots.
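One step in such a pipeline can be sketched as follows: deriving a coarse action map from remote-sensing imagery. This is a hypothetical illustration only, not the lab's actual method; the band values, the NDVI index choice, and the zone thresholds are all made-up assumptions.

```python
# Hypothetical sketch: turning satellite/drone imagery into a coarse action map.
# NDVI is computed from red and near-infrared reflectance bands, then
# thresholded into zones. All values and thresholds are illustrative.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

def prescription_zones(index: np.ndarray) -> np.ndarray:
    """Map NDVI to coarse zones: 0 = no action, 1 = monitor, 2 = intervene."""
    zones = np.full(index.shape, 1, dtype=int)
    zones[index >= 0.6] = 0   # healthy vegetation
    zones[index < 0.3] = 2    # stressed vegetation, candidate for action
    return zones

# Toy 2x2 "image" of reflectance values per band
nir = np.array([[0.8, 0.5], [0.2, 0.6]])
red = np.array([[0.1, 0.3], [0.4, 0.2]])
zones = prescription_zones(ndvi(nir, red))
print(zones)  # [[0 2]
              #  [2 1]]
```

In practice the zone map would feed the decision support system, which in turn dispatches robots to the flagged cells; that link is exactly the automation the project aims to build.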

Academic Qualifications

PhD in agricultural production chains, computer science, engineering, or a similar field.

Minimum profile required

Experience with geographic information systems. More than 10 scientific publications in indexed journals. Proven experience applying machine learning techniques in Python.



Research Opportunities

Operations Research

Work description

Development of a solution method aimed at solving a network design problem within the scope of the insect-based products supply chain, and analysis of the results through sensitivity analyses.
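The kind of model such a solution method targets can be sketched as a tiny fixed-charge network design problem. This is a made-up instance solved with SciPy's MILP interface, not the project's actual model or method; the two candidate arcs, their capacities, and all costs are illustrative assumptions.

```python
# Hypothetical fixed-charge network design sketch: route 10 units of demand
# over two candidate arcs, each with a unit flow cost and a fixed opening cost.
# Variables: [flow_1, flow_2, open_1, open_2]; open_i are binary.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([1.0, 2.0, 5.0, 3.0])  # unit flow costs, then fixed opening costs

constraints = [
    # flow_1 + flow_2 == 10 (demand satisfaction)
    LinearConstraint([[1, 1, 0, 0]], 10, 10),
    # flow_i <= capacity_i * open_i (linking constraints; capacities 15 and 8)
    LinearConstraint([[1, 0, -15, 0]], -np.inf, 0),
    LinearConstraint([[0, 1, 0, -8]], -np.inf, 0),
]
integrality = np.array([0, 0, 1, 1])              # flows continuous, opens binary
bounds = Bounds([0, 0, 0, 0], [np.inf, np.inf, 1, 1])

res = milp(c=c, constraints=constraints, integrality=integrality, bounds=bounds)
print(res.fun, res.x)  # optimal cost 15.0: send all 10 units on arc 1
```

At realistic scale such models are too large for exact solvers, which is where the approximate solution methods named in the profile below come in; sensitivity analysis then probes how the design reacts to changes in demands, capacities, and costs.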

Academic Qualifications

Master’s Degree in Engineering and Industrial Management, or similar.

Minimum profile required

Proficiency in English. Knowledge of linear and integer programming. Programming skills in Python and C++. Prior experience developing approximate solution methods for optimization problems.

Preference factors

Previous experience with decision methods applied to supply chain planning.

Maintenance stipend: €1,309.64, according to the table of monthly maintenance stipends for FCT grants, paid via bank transfer. Grant holders may be awarded supplements, subject to a quarterly evaluation process (Articles 19, 21 and 22 of the Regulations for Grants of INESC TEC and Annex



Implementation Update: Promoting Maximal Transparency Under the NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules

Institutional Biosafety Committees (IBCs) serve as a critical linchpin in ensuring the safe and responsible conduct of research. Since the issuance of the NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules (NIH Guidelines) in 1976, IBCs have grown in number to the thousands and have voluntarily expanded their roles to encompass new research approaches as they arise.

IBCs continue to serve as a pillar of biosafety oversight and are essential in building trust on behalf of the biomedical research enterprise. Under the NIH Guidelines, this expectation is a mandate; as such, NIH is reinforcing its commitment to working with IBCs to ensure transparency in biosafety oversight by updating its implementation expectations to protect the safety of all



Protecting Human Genomic Data when Developing Generative Artificial Intelligence Tools and Applications

Artificial intelligence (AI) tools and applications are proving to be transformative for driving new biomedical research advances. While development and use of generative AI is becoming increasingly prevalent, NIH urges the research community to remain vigilant of potential risks of inadvertent data disclosure when sharing AI tools and applications. Specifically, NIH reminds researchers that:

The Genomic Data Sharing (GDS) Policy and the subsequent Data Use Certification (DUC) Agreement prohibit users from distributing controlled-access data (including genomic or associated data) or their Data Derivatives to any entity or individual not identified in their Data Access Request without appropriate written approvals from the NIH. Sharing, retaining, or training generative AI models using controlled-access human genomic data may risk disclosing controlled-access data and, thus, violates the Non-Transferability provision of
