RSS Feed Source: Academic Keys

Job ID: 255013

INESC TEC | Research Grant (AE2025-0148)
INESC TEC

Research Opportunities

Efficiency and Effectiveness of educational systems

Work description

Contextualisation of the education landscape in the European Union. A thorough literature review. Identification of the knowledge gap the dissertation aims to cover. Collection and processing of data, including the contextual variables relevant to the problem. Implementation of robust conditional approaches based on Data Envelopment Analysis (DEA) or Benefit-of-the-Doubt (BoD) models. Implementation of goal programming approaches. Analysis and discussion of the results.
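For context on the DEA work mentioned above: a standard starting point is the input-oriented CCR model, which scores each decision-making unit (DMU) by the smallest uniform input contraction that keeps it inside the production set spanned by its peers. The sketch below is illustrative only (toy data, plain CCR rather than the robust conditional variants the grant targets) and solves the envelopment-form linear program with SciPy:

```python
# Minimal sketch of an input-oriented CCR DEA model (constant returns to
# scale), solved as a linear program. Toy data; not the robust conditional
# DEA/BoD formulations the grant describes.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency score of DMU `o`.
    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix."""
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0  # minimise theta (the input contraction factor)
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Output constraints: sum_j lambda_j * y_rj >= y_ro, written as <=
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Toy data: 3 DMUs, one input, one output
X = np.array([[2.0], [4.0], [6.0]])
Y = np.array([[1.0], [2.0], [2.0]])
for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

A score of 1 marks a DMU on the efficient frontier; lower scores indicate the proportion by which inputs could be scaled down given peer performance.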

Academic Qualifications

Master's degree in Industrial Engineering and Management.

Minimum profile required

Average grade of 18 in the Master's degree in Industrial Engineering and Management.

Preference factors

Good knowledge of the English language.



Research Opportunities

Artificial intelligence and remote observation for IoT and Robotics

Work description

The TRIBE LAB Laboratory – Robotics and IoT for precision agriculture and forestry has been working on robotic and IoT technology that needs to be integrated with a decision support system to increase its autonomy and impact. In this context, the aim is to explore new projects, publications and concepts that integrate Artificial Intelligence with data provided by satellites, drones, robots and IoT to produce prescription, productivity and action maps that can support farmers in their decision-making, as well as to automate the connection between the decision support system and the robots.

Academic Qualifications

PhD in agricultural production chains, computer science, engineering or a similar field.

Minimum profile required

Experience with geographic information systems. More than 10 scientific publications in indexed journals. Proven experience applying machine learning techniques in Python.



Implementation Update: Promoting Maximal Transparency Under the NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules

Institutional Biosafety Committees (IBCs) serve as a critical linchpin in ensuring the safe and responsible conduct of research. Since the issuance of the NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules (NIH Guidelines) in 1976, IBCs have expanded in number to the thousands and have voluntarily expanded their roles to encompass new research approaches as they arise.

IBCs continue to serve as a pillar of biosafety oversight and are essential in building trust on behalf of the biomedical research enterprise. Under the NIH Guidelines, this expectation is a mandate, and as such NIH is reinforcing its commitment to working with IBCs to ensure transparency in biosafety oversight by updating its implementation expectations to protect the safety of all



Protecting Human Genomic Data when Developing Generative Artificial Intelligence Tools and Applications

Artificial intelligence (AI) tools and applications are proving to be transformative for driving new biomedical research advances. While development and use of generative AI is becoming increasingly prevalent, NIH urges the research community to remain vigilant of potential risks of inadvertent data disclosure when sharing AI tools and applications. Specifically, NIH reminds researchers that:

The Genomic Data Sharing (GDS) Policy and the subsequent Data Use Certification (DUC) Agreement prohibit users from distributing controlled-access data (including genomic or associated data) or their Data Derivatives to any entity or individual not identified in their Data Access Request without appropriate written approvals from the NIH. Sharing, retaining, or training generative AI models using controlled-access human genomic data may risk disclosing controlled-access data and, thus, violates the Non-Transferability provision of
