FAQ

  • While AI is increasingly being used in the field of mental health, it is unlikely to completely replace human therapists. AI has the potential to provide a range of benefits to both therapists and clients, including more personalized treatment plans, improved tracking of symptoms, and increased accessibility to mental health care.

    While AI can assist therapists in diagnosing and treating mental health conditions, it is ultimately up to the human therapist to interpret the data and make decisions about patient care.

    In short, AI is unlikely to completely replace human therapists; it is more likely to be used alongside them to provide more effective and personalized mental health care.

  • In this cross-sectional study of 195 randomly drawn patient questions from a public social media forum, a team of licensed health care professionals compared physician and chatbot responses to the patients’ publicly posted questions. The chatbot responses were preferred over physician responses and rated significantly higher for both quality and empathy.

    Of the 195 questions and responses, evaluators preferred chatbot responses to physician responses in 78.6% (95% CI, 75.0%-81.8%) of the 585 evaluations. Mean (IQR) physician responses were significantly shorter than chatbot responses (52 [17-62] words vs 211 [168-245] words; t = 25.4; P < .001). Chatbot responses were rated significantly higher in quality than physician responses (t = 13.3; P < .001). The proportion of responses rated good or very good quality (≥4), for instance, was higher for the chatbot than for physicians (chatbot: 78.5%, 95% CI, 72.3%-84.1%; physicians: 22.1%, 95% CI, 16.4%-28.2%), a 3.6 times higher prevalence of good or very good quality responses for the chatbot. Chatbot responses were also rated significantly more empathetic than physician responses (t = 18.9; P < .001). The proportion of responses rated empathetic or very empathetic (≥4) was likewise higher for the chatbot than for physicians (chatbot: 45.1%, 95% CI, 38.5%-51.8%; physicians: 4.6%, 95% CI, 2.1%-7.7%), a 9.8 times higher prevalence of empathetic or very empathetic responses for the chatbot.

    In this cross-sectional study, a chatbot generated quality and empathetic responses to patient questions posed in an online forum. Further exploration of this technology is warranted in clinical settings, such as using a chatbot to draft responses that physicians could then edit. Randomized trials could further assess whether using AI assistants improves responses, lowers clinician burnout, and improves patient outcomes.

  • Stigma: There is still a significant amount of stigma around mental health issues, and many people may feel ashamed or embarrassed to seek help from a therapist. This stigma can be especially pronounced in certain cultures or communities where mental health issues are viewed as a weakness or a personal failing.

    Lack of Awareness: Some people may not be aware of the benefits of therapy or may not understand what it entails. They may have misconceptions about therapy or fear that it will be a waste of time and money.

    Fear of Judgement: People may worry about being judged by a therapist, or fear that their therapist will think less of them if they reveal personal details or vulnerabilities.

    Trust Issues: Some people may have trust issues and may find it difficult to open up to a stranger, especially if they have had negative experiences in the past.

    Financial Constraints: For some, the cost of therapy may be a significant barrier. Even with insurance, therapy can be expensive and may not be covered by all plans.

    Accessibility: In some areas, there may be a shortage of therapists or a lack of access to mental health services, particularly in rural or low-income communities.

    It is important to address these barriers and work towards making therapy more accessible and acceptable to all individuals who may benefit from it. This can involve reducing the stigma surrounding mental health, increasing awareness of the benefits of therapy, and making mental health services more affordable and accessible.

  • Murror aims to help users gain awareness of mental health issues and realize the importance of support. We do not aim to have AI become a therapist, but rather an assistant allowing users to understand their issues, discover themselves, practice mindfulness, and improve mental strength. Since seeing a mental health professional is a huge step that a lot of people don’t take, Murror could help users become aware of their mental health problems and make them feel more comfortable seeing a professional for support.

  • Building an AI mental health app with the highest privacy and security standards requires careful planning and attention to detail. Here are some key steps you can take to ensure the privacy and security of your users:

    Conduct a Privacy Impact Assessment (PIA): A PIA is a process that helps you identify and assess the potential privacy risks associated with your app. It involves analyzing the data that your app collects, how it's used, and who has access to it. You should also assess the security controls that you have in place and identify any potential vulnerabilities.

    Follow Privacy Regulations: You need to comply with privacy regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). These regulations set out specific requirements for how you collect, use, and store personal data.

    Use End-to-End Encryption: End-to-end encryption ensures that only the sender and the intended recipient of a message can read it; the service itself only ever handles ciphertext. This helps prevent unauthorized access to sensitive data (a short end-to-end encryption sketch appears at the end of this answer).

    Implement Multi-Factor Authentication: Multi-factor authentication (MFA) requires users to present two or more independent factors, such as a password plus a one-time code, before accessing their accounts. This helps prevent unauthorized access even if a password is compromised (a TOTP sketch appears at the end of this answer).

    Store Data Securely: Ensure that all user data is encrypted at rest and that access to it is tightly controlled. You should also implement secure backup and recovery processes to prevent data loss (an at-rest encryption sketch appears at the end of this answer).

    Conduct Regular Security Audits: Regular security audits can help identify any potential vulnerabilities in your app's security controls. This can help you identify and address any weaknesses before they can be exploited by hackers.

    Use Trusted AI Models: Use trusted AI models that have been developed by reputable researchers and have undergone rigorous testing. This can help ensure that your app's recommendations are accurate and reliable.

    Overall, building an AI mental health app with high privacy and security requires a holistic approach that considers all aspects of data privacy and security. By following best practices and complying with regulations, you can build an app that is trusted and valued by users.
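
    As a rough illustration of the end-to-end encryption step above, the sketch below uses the PyNaCl library (an assumed choice; any well-reviewed libsodium binding would work) to show how a message encrypted on the sender's device can only be decrypted by the intended recipient, so the server only ever handles ciphertext.

      # Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
      # Illustrative only: key distribution, storage, and rotation are out of scope.
      from nacl.public import PrivateKey, Box

      # Each party generates a keypair on their own device; private keys never leave it.
      client_sk = PrivateKey.generate()        # e.g., the user's phone
      recipient_sk = PrivateKey.generate()     # e.g., the receiving device

      # The sender encrypts with their private key and the recipient's public key.
      sender_box = Box(client_sk, recipient_sk.public_key)
      ciphertext = sender_box.encrypt(b"Today felt harder than usual.")

      # The server can store or relay `ciphertext` but cannot read it.
      recipient_box = Box(recipient_sk, client_sk.public_key)
      plaintext = recipient_box.decrypt(ciphertext)
      assert plaintext == b"Today felt harder than usual."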
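
    For the multi-factor authentication step, one common second factor is a time-based one-time password (TOTP). The following sketch uses the pyotp library (again an assumed choice; the account name and issuer are placeholders) to enroll a user and then verify a code from their authenticator app.

      # TOTP second-factor sketch using pyotp (pip install pyotp).
      # Illustrative only: the per-user secret should itself be stored encrypted.
      import pyotp

      # Enrollment: generate a secret and hand it to the user's authenticator app,
      # typically by rendering the provisioning URI as a QR code.
      secret = pyotp.random_base32()
      totp = pyotp.TOTP(secret)
      uri = totp.provisioning_uri(name="user@example.com", issuer_name="ExampleApp")

      # Login: after the password check, verify the 6-digit code the user submits.
      submitted_code = totp.now()  # stands in for user input in this sketch
      if totp.verify(submitted_code, valid_window=1):  # tolerate slight clock drift
          print("second factor accepted")
      else:
          print("second factor rejected")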
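
    For the data-storage step, one common pattern is to encrypt records before they are written to the database. The sketch below uses the Fernet recipe from the cryptography package (an assumed library choice); in practice the key would come from a key management service rather than being generated in application code.

      # At-rest encryption sketch using the cryptography package (pip install cryptography).
      # Illustrative only: in production the key lives in a KMS or secrets manager.
      from cryptography.fernet import Fernet

      key = Fernet.generate_key()   # urlsafe base64-encoded 32-byte key
      fernet = Fernet(key)

      record = b'{"mood": "anxious", "note": "rough day at work"}'
      encrypted_record = fernet.encrypt(record)            # store this in the database
      decrypted_record = fernet.decrypt(encrypted_record)  # on an authorized read
      assert decrypted_record == record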

Limited free beta access