Recommendations
Pathways and Opportunities for Trust
R1. Design for the Realities of Shared Access
R1.1 Develop locks that allow multiple accounts on smartphones and platforms, to mitigate fears around surveillance and increase privacy and safety within shared devices.
R1.2 Ensure interfaces for different user profiles are easily distinguishable and customisable, to secure user agency and control despite shared access. This includes the ability to download platforms specific to profiles, separate file storage accessible only to the active user, and notifications visible only to the active user.
R1.3 Create mechanisms for ‘guest users’ to use a smartphone without being able to access the data of the phone’s regular users.
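The profile-isolation behaviour described in R1.2 and R1.3 can be sketched in code. This is a minimal illustration, not a real mobile-OS API; the `Profile` class and `read_file` helper are hypothetical names invented for this example.

```python
# Hypothetical sketch of R1.2/R1.3: per-profile storage, with guests
# unable to read another profile's data. All names are illustrative.

class Profile:
    def __init__(self, name: str, is_guest: bool = False):
        self.name = name
        self.is_guest = is_guest
        self.files: dict[str, str] = {}   # storage scoped to this profile (R1.2)

def read_file(active: Profile, owner: Profile, filename: str) -> str:
    """Only the owning profile may read its own files; guests and other
    profiles are denied (R1.3)."""
    if active is not owner:
        raise PermissionError(f"{active.name} cannot access {owner.name}'s data")
    return owner.files[filename]
```

The key design choice is that isolation is enforced at the storage layer rather than by hiding icons in the interface, so a guest session cannot reach regular users’ data even by navigating around the UI.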
R2. Institute Gender-Sensitive Safety Mechanisms
R2.1 Create more ‘women-first’ interaction options on social media platforms, to reduce harm from unsolicited messages and calls from unknown users. This would ensure that women do not receive any communication from a male user unless they have initiated the interaction, reducing threats to their safety.
R2.2 Allow users to retract data at will, to increase their sense of control and safety online.
R2.3 Offer more options for ‘disappearing’ or ‘temporary’ content, to give users more control over their information and images. This includes data stored within the platform, such as transaction and purchase history.
R2.4 Establish mechanisms to ensure other users cannot take screenshots or video recordings of a user’s images.
R2.5 Learn from third-party clones of mainstream social media platforms, to better understand how to bridge gaps in privacy and security features.
R2.6 Design transparent, user-forward grievance redressal mechanisms. Create protocols that rank which interactions should be managed by a chatbot versus a human, to provide sensitive and appropriate treatment for varying experiences of harm.
R3. Design for Gender-Based Attitudes to General Risks
R3.1 Create ‘risk’ settings for profiles that enable varying privacy and ‘ease of use’ settings, to accommodate the different modes of engagement that people adopt on different platforms.
R3.2 Provide options for first-time users to engage in trial-and-error interactions, such as sending INR 1 to a known number before undertaking the interaction in an uncontrolled setting. This provides a safe space to test one’s ability to carry out an action.
R3.3 Optimise for helpful frictions that create opportunities to think through a decision twice. This includes adding verification frictions for ‘high-stakes’ actions, such as sending a larger sum of money than usual to an unknown or infrequently contacted number.
R3.4 Establish technical standards for reciprocity, particularly in ‘high-stakes’ settings such as payment and e-commerce platforms. Examples include acknowledgement of user actions, progress bars for multi-step interactions, and confirmation when transactions are completed. This reduces user confusion and provides clarity on the status of the interaction.
R3.5 Deploy reassuring language around user error to minimise fear-based responses, followed by clear pathways to redressal and friendly assurances about outcomes. This should also be incorporated during buffering, network-related delays, and interruptions.
R3.6 Incorporate options for users to undo their actions (within a certain time limit) where possible, to reduce the fear of permanent outcomes associated with mistakes, such as accidentally sending messages or money.
R3.7 Design chatbots with more ‘human’ or ‘friend-like’ interfaces and language, based on local cultural communication and further research on affective responses and gender-sensitive language.
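Two of the mechanisms above, verification frictions for high-stakes transfers (R3.3) and a bounded undo window (R3.6), can be sketched together. This is an illustrative sketch only: the threshold, the 30-second window, and all class and function names are assumptions, not a prescribed implementation.

```python
import time
from dataclasses import dataclass, field

# Illustrative values, not recommendations from the report.
UNDO_WINDOW_SECONDS = 30
HIGH_STAKES_MULTIPLIER = 3   # flag transfers well above the user's norm

@dataclass
class PendingTransfer:
    recipient: str
    amount: float
    created_at: float = field(default_factory=time.time)
    cancelled: bool = False

def needs_confirmation(amount: float, typical_amount: float,
                       known_recipient: bool) -> bool:
    """Helpful friction (R3.3): require an extra confirmation step for
    transfers to unknown numbers or for unusually large amounts."""
    return (not known_recipient) or amount > HIGH_STAKES_MULTIPLIER * typical_amount

def undo(transfer: PendingTransfer, now: float = None) -> bool:
    """Bounded undo (R3.6): cancel a transfer only while it is still
    inside the grace period."""
    now = time.time() if now is None else now
    if not transfer.cancelled and now - transfer.created_at <= UNDO_WINDOW_SECONDS:
        transfer.cancelled = True
        return True
    return False
```

The point of the sketch is that both features are cheap to implement server-side: friction is a predicate evaluated before execution, and undo is a state flag checked within a fixed window before the transfer is finalised.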
R4. Create Accessible Thresholds and Pathways for Relatable Engagement
R4.1 Optimise interfaces to load quickly instead of prioritising sophisticated and complex visuals. This ensures uninterrupted access to the platform, even in low-network settings.
R4.2 Follow principles of graceful degradation, rather than failing altogether, so that online platforms offer a degree of usability in low-network settings. For instance, let users type and send a message that is delivered when the network improves, instead of the platform not loading at all.
R4.3 In times of slow loading or poor network, incorporate clear buffering and confirmation cues, so that the user knows whether they face connectivity issues, device slowdowns, or lags caused by their own error.
R4.4 Map and incorporate digital vocabulary, based on local languages and cultural signifiers, to create recognition, relatability, and ease of use.
R4.5 Provide speech-to-text and voice-recognition pathways to increase accessibility for users with disability- or literacy-based limitations.
R4.6 Allow more generous time limits for completing an action, to account for users who need longer to carry out text-based actions. For instance, provide more time to fill in an OTP.
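The graceful-degradation pattern in R4.2 can be sketched as a local outbox: when the network is down, the platform accepts the message and holds it, then flushes the queue once connectivity returns. The class and method names below are hypothetical, chosen for illustration.

```python
from collections import deque

class OfflineMessageQueue:
    """Sketch of R4.2: accept messages while offline and deliver them
    in order once the network returns, instead of failing outright."""

    def __init__(self, send_fn):
        self.send_fn = send_fn      # actually transmits one message
        self.pending = deque()      # messages waiting for connectivity
        self.online = False

    def send(self, message: str) -> str:
        if self.online:
            self.send_fn(message)
            return "sent"
        self.pending.append(message)    # degrade gracefully: hold, don't fail
        return "queued"                  # clear status cue for the user (R4.3)

    def set_online(self, online: bool) -> None:
        """Flush queued messages in order when connectivity returns."""
        self.online = online
        while self.online and self.pending:
            self.send_fn(self.pending.popleft())
```

Returning an explicit "queued" status also supports R4.3: the user learns that the delay is a connectivity issue, not their own error.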
R5. Expand Engagement and Learning Formats to Increase Digital Safety and Retention
R5.1 Design simulations for women to navigate a platform journey before undertaking the actions in real time. This helps build a sense of surety before women engage with the unfamiliar territory of the platform for the first time.
R5.2 Provide pictorial and audio-visual explainers for specific harm-based interactions (such as money failing to transfer on a platform), to make knowledge about dealing with trust violations more accessible.
R5.3 Conduct research to understand where users face the most difficulty in proceeding with steps on a platform, and incorporate nudges and directions that support their interactions at those stages.
R5.4 Explore forms of gamification (streaks, progress, points, nudges, scratch cards) and customisation in essential service platforms to increase engagement. Additionally, test new incentives with existing users to see where they pave the way to early adoption of new functions.
R6. Create Resilience through Long-Term Thinking around Gender-Based Digital-Skilling Programmes
R6.1 Ensure a higher number of women trainers in digital-skilling programmes to establish relatability and relevance for women. This serves to increase enrolment over time.
R6.2 Include provisions for smartphones and/or mobile data packs within scholarship schemes for girls and women.
R6.3 Expand metrics of success to include qualitative learning indicators that prioritise women’s comfort and self-confidence in digital contexts. These include measuring a participant’s ability to apply what they’ve learnt in a different context, or evaluating their ability to teach the skill to another person.
R6.4 Institute post-programme mentorship or buddy initiatives to sustain women’s digital skills and confidence in the long term.
R7. Combine Established Media Consumption Patterns with Digital Campaigns for Greater Reach
R7.1 Develop a communication plan based on the varied behaviours and attitudes of women. Design targeted strategies to reach a broader range of women, tailoring the approach to factors such as access to phones or literacy levels.
R7.2 Collaborate with trusted public figures and celebrities to endorse digital initiatives, building recognition and familiarity.
R7.3 Integrate actions into advertisements, posters, and offline communications around schemes and platforms (such as QR codes to learn more) to emphasise their value for women.
R7.4 Utilise alternative modes of outreach, such as broadcast radio and nukkad nataks (street plays), to deliver learnings through familiar and impactful formats for women.
R7.5 Make campaign materials visually engaging and available in more languages to ensure relatability and accessibility across diverse regions.
R8. Design Strong Messaging for Women’s Claims of Independent Usage
R8.1 Centre messaging around online privacy as a critical empowerment tool, not merely a safety measure. Emphasising the strength of women with deep knowledge of digital security pivots social perceptions from fear of privacy features to greater acceptance of the positive effects of privacy protection.
R8.2 Share stories of women who have overcome digital harassment within reporting/blocking FAQs, to reduce the stigma surrounding it. This shifts the blame for online harm to the perpetrators instead of the victims, ensuring the fear of harassment does not impede more women from using digital platforms.
R8.3 Champion real-life stories of women who have achieved financial independence through digital means, to create more positive associations with women’s digital usage and success.
R9. Legitimise Community Structures to Fortify Digital Trust Actions
R9.1 Create helplines and digital support systems rooted in community infrastructures, such as SHGs or Anganwadi centres. This provides accessible, localised, and judgement-free assistance for digital challenges, increases user familiarity with a safety mechanism, and builds digital legitimacy for last-mile delivery points.
R9.2 Integrate peer-support systems, such as paired user and assistant profiles, into government platforms to enhance digital learning and build community-supported trust among users. The success of this structure can lead to approachable, knowledgeable ‘digital champions’ within communities who offer advice and support on digital issues.
R9.3 Develop ‘digital circles’, modelled after reading groups and facilitated by the local administration, where users share experiences, learn new skills, and solve digital challenges collectively.
R9.4 Design programmes where younger, digitally native users are incentivised to teach digital skills to older generations.
R10. Create Trust-Fostering Digital Environments for Intermediaries
R10.1 Remove restrictions on the types of platforms permitted on smartphones distributed to intermediaries, so they can freely explore other platforms and increase their digital competencies in a safe setting.
R10.2 Establish standards for sufficient storage space and regular replacement of smartphones, to ensure that intermediaries do not experience interruptions in their digital work.
R10.3 Provide training content for intermediaries via visually engaging explainer videos, to improve information retention through their familiar practices of ‘observational learning’.
R10.4 Formalise the role of ‘community digital mentors’ by recognising and celebrating the contributions of intermediaries in supporting hesitant digital users. This can be done through structured programmes of felicitation and public acknowledgement, fostering a culture of celebration and motivation.
R11. Create Accessible Thresholds and Pathways for Relatable Management
R11.1 Consolidate multiple government platforms into a single, functional platform, or redirect functions between related government platforms, to reduce user friction and confusion.
R11.2 Research women’s offline experiences and usage of physical/human touchpoints to understand existing behaviours for a service, and then mimic those functionalities through digital design practices and portals. Build upon the offline user journey to increase user familiarity, perceived relevance, and tolerance of friction. For example, simplify onboarding for government platforms, as multiple rounds of verification deter willingness to enrol in, access, and regularly use a service.
R11.3 Leverage existing multipurpose centres for digital services. For instance, utilise Jan Seva Kendras (JSKs) to provide digital services such as money transfers, Aadhaar card printing, and e-ticketing. Invest in research to understand how centres like JSKs are currently used, to optimise for more accessible delivery and a wider range of services.
R11.4 Develop government platforms to offer certain functions even in low-network settings. Build more resilient functionalities to combat missing links and frequent crashes in existing platforms, ensuring seamless functionality and consistent performance.
R11.5 Maintain and update platforms regularly to keep pace with user needs. However, updates should make minimal changes to the interface to avoid confusing users, particularly those who rely on visuals over text to navigate platforms.
R11.6 To give users greater control and flexibility, provide multiple pathways to action that enable diverse decision-making routes. For instance, UPI’s multiple payment options – phone numbers, IDs, and QR codes.
R11.7 Offer granular privacy options on platforms, such as profile picture controls, around all personally identifiable information. This is crucial even on platforms that do not necessarily create interaction between users, as it gives users practice in secure digital identity, thus building a safer environment.
R11.8 Collaborate with existing helplines under the Ministry of Women and Child Development (MWCD) to create therapeutic ‘sounding boards’ for confidential reporting and sharing of experiences concerning technology-related harms.
R11.9 Design reflexive and more interactive response structures in telehealth services to account for users’ need to ask questions or have information repeated.
R11.10 Map and incorporate digital vocabulary based on local languages and cultural signifiers to create recognition, relatability, and ease of use.
R12. Enhance Gender-Intentional Data Quality to Create Targeted Trust Interventions
R12.1 Collect gender-disaggregated data around digital financial inclusion initiatives to build insights on women’s degrees of engagement, including:
a) the number of opportunities accessible to men or women of similar socio-economic backgrounds;
b) differences in engagement with formal finance between men and women of similar socio-economic backgrounds, and the reasons for these differences;
c) women’s and men’s perceptions of the quality of engagement with finance schemes;
d) links between overall wellbeing and financial inclusion levels, with men’s and women’s experiences disaggregated.
R12.2 Collect gender-disaggregated data on the duration of use of online education resources, the types of devices accessed by users, and the ownership of these devices.
R12.3 Invest in research on gender-intentional privacy and safety features. Create quantitative data on which features are used extensively or sparsely by women, and qualitative research on the obstacles and enablers behind these usage patterns.
R12.4 Develop pedagogical interventions to increase women’s digital resilience, such as the ability to recognise fraud and make informed digital decisions.
R12.5 Create public transparency communication by collecting data on the number of complaints related to harm against women, decisions taken, resolution rates, and other metrics, to increase belief in accountability.

Aapti Institute
37, Aga Abbas Ali Rd, Halasuru Yellappa Chetty Layout, Sivanchetti Gardens, Bengaluru, Karnataka 560042
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 2.5 India License. View a copy of this license at creativecommons.org/licenses/by-nc-sa/2.5/in/
Write to us at
© 2025 Aapti Institute. All rights reserved.

