
Written Study


Top Open-Source LLMs for 2024 and Their Uses

Discover some of the most powerful open-source LLMs and why they will be crucial for the future of generative AI.

The current generative AI revolution wouldn't be possible without the so-called large language models (LLMs). Based on transformers, a powerful neural architecture, LLMs are AI systems used to model and process human language. They are called "large" because they have hundreds of millions or even billions of parameters, which are pre-trained using a massive corpus of text data. Start our Large Language Models (LLMs) Concepts Course today to learn more about how LLMs work.

LLMs are the foundation models of popular and widely used chatbots, like ChatGPT and Google Bard. In particular, ChatGPT is powered by GPT-4, an LLM developed and owned by OpenAI, while Google Bard is based on Google's PaLM 2 model. ChatGPT and Bard, as well as many other popular chatbots, have in common that their underlying LLMs are proprietary. That means they are owned by a company and can only be used by customers who buy a license. That license comes with rights, but also with possible restrictions on how the LLM may be used, as well as limited information on the mechanisms behind the technology.

Yet a parallel movement in the LLM space is rapidly gaining pace: open-source LLMs. Following rising concerns over the lack of transparency and limited accessibility of proprietary LLMs, mainly controlled by Big Tech companies such as Microsoft, Google, and Meta, open-source LLMs promise to make the rapidly growing field of LLMs and generative AI more accessible, transparent, and innovative.

This article explores the top open-source LLMs available in 2024. Although it has been only a short time since the launch of ChatGPT and the popularization of (proprietary) LLMs, the open-source community has already achieved important milestones, with a good number of open-source LLMs available for different purposes. Keep reading to check out the most popular ones!

Develop AI Applications: learn to build AI applications using the OpenAI API. Start upskilling for free.

Benefits of Using Open-Source LLMs

There are multiple short-term and long-term benefits to choosing open-source LLMs instead of proprietary LLMs. Below is a list of the most compelling reasons:

Enhanced data security and privacy

One of the biggest concerns about using proprietary LLMs is the risk of data leaks or unauthorized access to sensitive data by the LLM provider. Indeed, there have already been several controversies regarding the alleged use of personal and confidential data for training purposes. By using an open-source LLM, companies remain solely responsible for the protection of personal data, as they keep full control of it.

Cost savings and reduced vendor dependency

Most proprietary LLMs require a license to use them. In the long term, this can be a significant expense that some companies, especially small and medium-sized enterprises, may not be able to afford. This is not the case with open-source LLMs, as they are normally free to use. However, it's important to note that running LLMs requires considerable resources, even just for inference, which means you will normally have to pay for cloud services or powerful infrastructure.

Code transparency and language model customization

Companies that opt for open-source LLMs have access to the inner workings of the models, including their source code, architecture, training data, and mechanisms for training and inference.
This transparency is the first step toward scrutiny, but also toward customization. Since open-source LLMs are accessible to everyone, including their source code, companies using them can customize them for their particular use cases.

Active community support and fostering innovation

The open-source movement promises to democratize access to and use of LLM and generative AI technologies. Allowing developers to inspect the inner workings of LLMs is key to the future development of this technology. By lowering entry barriers for coders around the world, open-source LLMs can foster innovation and improve the models by reducing biases and increasing accuracy and overall performance.

Addressing the environmental footprint of AI

Following the popularization of LLMs, researchers and environmental watchdogs have raised concerns about the carbon footprint and water consumption required to run these technologies. Proprietary providers rarely publish information on the resources required to train and operate their LLMs, or on the associated environmental footprint. With open-source LLMs, researchers have a better chance of accessing this information, which can open the door to new improvements designed to reduce the environmental footprint of AI.

8 Top Open-Source Large Language Models For 2024

1. LLaMA 3.1

Most top players in the LLM space have opted to build their LLMs behind closed doors. However, Meta continues to be an exception with its series of open-source LLMs, which now includes the latest LLaMA 3.1. Released on July 23, 2024, LLaMA 3.1 includes models with 8B, 70B, and, for the first time, 405B parameters, making it the largest in the series. These models have been designed to handle a variety of natural language processing tasks across multiple languages, including English, Spanish, Portuguese, German, Thai, French, Italian, and Hindi.

The LLaMA 3.1 models support a vastly increased context length of 128,000 tokens, which enhances their ability to process and understand lengthy texts, significantly improving performance on complex reasoning tasks and helping maintain context in longer conversations. The 405B model, in particular, is a powerhouse for synthetic data generation, which can be used to train other models, and for knowledge distillation, allowing the knowledge from this large model to be transferred to smaller, more efficient models. This capability opens up new possibilities for deploying advanced AI in resource-constrained environments. Moreover, LLaMA 3.1 continues to leverage reinforcement learning from human feedback (RLHF), ensuring that the models align with human preferences for helpfulness and safety.

To learn more about LLaMA, check out our Introduction to Meta AI's LLaMA and our Fine-Tuning LLaMA 3.1 articles.

2. BLOOM

Launched in 2022 following a year-long collaborative project with volunteers from 70+ countries and researchers from Hugging Face, BLOOM is an autoregressive LLM trained to continue text from a prompt on vast amounts of text data using industrial-scale computational resources.

The release of BLOOM marked an important milestone in democratizing generative AI. With 176 billion parameters, BLOOM is one of the most powerful open-source LLMs, capable of producing coherent and accurate text in 46 natural languages and 13 programming languages. Transparency is the backbone of BLOOM: everyone can access the source code and the training data in order to run, study, and improve it. BLOOM can be used for free through the Hugging Face ecosystem.
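As a rough illustration of what "using BLOOM through the Hugging Face ecosystem" can look like, here is a minimal text-generation sketch with the transformers library. It assumes the small bigscience/bloom-560m checkpoint (chosen here only so the example runs on modest hardware; the full 176B model needs far more resources) and a working PyTorch installation:

    # Minimal sketch: generate text with a small BLOOM checkpoint via Hugging Face transformers.
    # Assumes: pip install transformers torch; "bigscience/bloom-560m" is used purely for illustration.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_id = "bigscience/bloom-560m"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    prompt = "Open-source language models matter because"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40)  # continue the prompt
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Most of the other open models discussed below can be loaded the same way by swapping in the appropriate model id, hardware permitting.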
3. BERT

The underlying technology of LLMs is a type of neural architecture called a transformer, introduced in 2017 by Google researchers in the paper "Attention Is All You Need." One of the first experiments to test the potential of transformers was BERT.

Launched in 2018 by Google as an open-source LLM, BERT (which stands for Bidirectional Encoder Representations from Transformers) rapidly achieved state-of-the-art performance in many natural language processing tasks. Thanks to its innovative features in the early days of LLMs and its open-source nature, BERT remains one of the most popular and widely used LLMs. For example, in 2020, Google announced that it had adopted BERT in Google Search across over 70 languages. There are currently thousands of open-source, free, pre-trained BERT models available for specific use cases, such as sentiment analysis, clinical note analysis, and toxic comment detection. Interested in the possibilities of BERT? Check out our Introduction to BERT article.

4. Falcon 180B

If Falcon 40B already impressed the open-source LLM community (it ranked #1 on Hugging Face's leaderboard for open-source large language models), the new Falcon 180B suggests that the gap between proprietary and open-source LLMs is rapidly closing. Released by the Technology Innovation Institute of the United Arab Emirates in September 2023, Falcon 180B has 180 billion parameters and was trained on 3.5 trillion tokens. With this scale, Falcon 180B has already outperformed LLaMA 2 and GPT-3.5 in various NLP tasks, and Hugging Face suggests it can rival Google's PaLM 2, the LLM that powers Google Bard. Although free for commercial and research use, it's important to note that Falcon 180B requires substantial computing resources to run.

5. OPT-175B

The release of the Open Pre-trained Transformers (OPT) language models in 2022 marked another important milestone in Meta's strategy of opening up the LLM race through open source. OPT comprises a suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters. OPT-175B, one of the most advanced open-source LLMs on the market, is the most powerful of the family, with performance similar to GPT-3. Both the pre-trained models and the source code are available to the public. Yet, if you're thinking of building an AI-driven business on LLMs, you'd better look at another model, as OPT-175B is released under a non-commercial license that allows the model to be used only for research.

6. XGen-7B

More and more companies are jumping into the LLM race. One of the latest to enter the ring was Salesforce, which launched its XGen-7B LLM in July 2023. According to the authors, most open-source LLMs focus on providing long answers from limited information (i.e., short prompts with little context). The idea behind XGen-7B is to build a tool that supports longer context windows. In particular, the most advanced variant of XGen (XGen-7B-8K-base) allows an 8K context window, that is, the cumulative size of the input and output text. Efficiency is another important priority: XGen uses only 7B parameters, far fewer than the most powerful open-source LLMs, like LLaMA 2 or Falcon. Despite its relatively small size, XGen can still deliver great results. The model is available for commercial and research purposes, except for the XGen-7B-{4K,8K}-inst variant, which has been trained on instructional data with RLHF and is released under a non-commercial license.
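To make the "input plus output share the context window" point concrete, here is a small, hedged sketch of how one might budget tokens before generating. It uses the GPT-2 tokenizer purely as a stand-in (XGen ships its own tokenizer on the Hugging Face Hub, and real token counts depend on the model's tokenizer); the 8192-token figure corresponds to the 8K window described above:

    # Sketch: budgeting prompt + completion tokens against a fixed context window.
    # The GPT-2 tokenizer is only a stand-in for illustration.
    from transformers import AutoTokenizer

    CONTEXT_WINDOW = 8192  # 8K tokens shared between the prompt and the generated text

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    prompt = "Summarize the following report: ..."  # imagine a long document here
    prompt_tokens = len(tokenizer(prompt)["input_ids"])

    # Whatever the prompt does not use is what remains for the model's answer.
    max_new_tokens = CONTEXT_WINDOW - prompt_tokens
    print(f"{prompt_tokens} prompt tokens, up to {max_new_tokens} tokens left for the output")

In practice a model also reserves some special tokens, so real budgets should leave a margin.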
7. GPT-NeoX and GPT-J

Developed by researchers at EleutherAI, a non-profit AI research lab, GPT-NeoX and GPT-J are two great open-source alternatives to GPT. GPT-NeoX has 20 billion parameters, while GPT-J has 6 billion. Although the most advanced LLMs are trained with well over 100 billion parameters, these two models can still deliver highly accurate results. They were trained on 22 high-quality datasets from a diverse set of sources, which enables their use in multiple domains and many use cases. In contrast to GPT-3, GPT-NeoX and GPT-J haven't been trained with RLHF. Many natural language processing tasks can be performed with GPT-NeoX and GPT-J, from text generation and sentiment analysis to research and marketing campaign development. Both LLMs are available for free through the NLP Cloud API.

8. Vicuna-13B

Vicuna-13B is an open-source conversational model fine-tuned from the LLaMA 13B model on user-shared conversations gathered from ShareGPT. As an intelligent chatbot, the applications of Vicuna-13B are countless, spanning industries such as customer service, healthcare, education, finance, and travel/hospitality. A preliminary evaluation using GPT-4 as a judge showed Vicuna-13B achieving more than 90% of the quality of ChatGPT and Google Bard, and it outperformed other models like LLaMA and Alpaca in more than 90% of cases.

Choosing the Right Open-Source LLM for Your Needs

The open-source LLM space is rapidly expanding. Today, there are many more open-source LLMs than proprietary ones, and the performance gap may soon be bridged as developers worldwide collaborate to upgrade current LLMs and design more optimized ones. In this vibrant and exciting context, it may be difficult to choose the right open-source LLM for your purposes. Here are some of the factors you should consider before opting for a specific open-source LLM:

  • What do you want to do? This is the first thing you have to ask yourself. Open-source LLMs are always open, but some of them are released for research purposes only. Hence, if you're planning to start a company, be aware of possible licensing limitations.
  • Why do you need an LLM? This is also extremely important. LLMs are currently in vogue; everyone is talking about them and their endless opportunities. But if you can build your idea without LLMs, then don't use them. It's not mandatory, and you will probably save a lot of money and avoid further resource use.
  • How much accuracy do you need? There is a direct relationship between size and accuracy in state-of-the-art LLMs: overall, the bigger the LLM in terms of parameters and training data, the more accurate the model will be. So, if you need high accuracy, you should opt for bigger LLMs, such as LLaMA or Falcon.
  • How much money do you want to invest? This is closely connected to the previous question. The bigger the model, the more resources are required to train and operate it. This translates into additional infrastructure or a higher bill from cloud providers if you want to operate your LLM in the cloud. LLMs are powerful tools, but they require considerable resources to use, even open-source ones.
  • Can you achieve your goals with a pre-trained model? Why invest money and energy in training your LLM from scratch if you can simply use a pre-trained model? There are many versions of open-source LLMs trained for specific use cases. If your idea fits one of these use cases, just go for it (see the sketch after this list).
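For instance, if the goal is something like sentiment analysis, a ready-made fine-tuned model is often enough. Here is a minimal, hedged sketch using the transformers pipeline API, which, when no model id is specified, downloads a default English sentiment classifier (a DistilBERT fine-tune at the time of writing):

    # Sketch: reuse a ready-made, task-specific model instead of training one from scratch.
    # With no model id given, the pipeline falls back to a default English sentiment classifier.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")
    reviews = [
        "The new release is fast and surprisingly easy to deploy.",
        "Setup took hours and the documentation was confusing.",
    ]
    for review, result in zip(reviews, classifier(reviews)):
        print(f"{result['label']:>8}  ({result['score']:.2f})  {review}")

Hugging Face hosts similar ready-made checkpoints for the other use cases mentioned above, such as clinical note analysis and toxic comment detection.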
Upskilling Your Team with AI and LLMs

Open-source LLMs aren't just for individual projects or interests. As the generative AI revolution continues to accelerate, businesses are recognizing the critical importance of understanding and implementing these tools. LLMs have already become foundational in powering advanced AI applications, from chatbots to complex data processing tasks. Ensuring that your team is proficient in AI and LLM technologies is no longer just a competitive advantage; it's a necessity for future-proofing your business.

If you're a team leader or business owner looking to empower your team with AI and LLM expertise, DataCamp for Business offers comprehensive training programs that can help your employees gain the skills needed to leverage these powerful tools. We provide:

  • Targeted AI and LLM learning paths: customizable to align with your team's current knowledge and the specific needs of your business, covering everything from basic AI concepts to advanced LLM development.
  • Hands-on AI practice: real-world projects that focus on building and deploying AI models, including working with popular LLMs like GPT-4 and open-source alternatives.
  • Progress tracking in AI skills: tools to monitor and assess your team's progress, ensuring they acquire the skills needed to develop and implement AI solutions effectively.

Investing in AI and LLM upskilling not only enhances your team's capabilities but also positions your business at the forefront of innovation, enabling you to harness the full potential of these transformative technologies. Get in touch with our team to request a demo and start building your AI-ready workforce today.

Conclusion

Open-source LLMs are part of an exciting movement. Given their rapid evolution, it seems the generative AI space won't necessarily be monopolized by the big players who can afford to build and use these powerful tools. We've covered only eight open-source LLMs here, but the actual number is much higher and growing rapidly. We at DataCamp will continue to cover the latest news in the LLM space, providing courses, articles, and tutorials about LLMs. For now, check out our list of curated materials:

  • Large Language Models (LLMs) Concepts Course
  • How to build LLM applications with LangChain
  • How to train an LLM with PyTorch
  • LlamaIndex: Adding Personal Data to LLMs
  • The Pros and Cons of Using LLMs in the Cloud Versus Running LLMs Locally


What is AI? A Quick-Start Guide For Beginners

Find out what artificial intelligence really is with examples, expert input, and all the tools you need to learn more.

The term AI has been in vogue for a while now. Artificial intelligence is a concept that has existed for many years, and various technologies rely on AI to function. But with tools like ChatGPT and Google Bard making headlines, it feels like a new age of artificial intelligence is upon us. But what is AI? And how does it work? Here, we take a brief look at what AI is, why it matters, and how you can learn more about this fascinating field. As well as examples of where artificial intelligence is used, we've included quotes from experts and resources for further reading.

What is AI?

Artificial intelligence is a subfield of computer science that focuses on creating intelligent agents capable of performing tasks that would typically require human levels of intelligence. These tasks include problem-solving, speech recognition, and decision-making, among others. AI is an interdisciplinary science with many approaches; it can be rule-based and operate under a predefined set of conditions, or it can use machine learning algorithms to adapt to its environment. The latter is particularly powerful, as it allows AI systems to learn from data, making them more versatile and capable of handling unforeseen scenarios.

(Figure: AI in relation to data science and other key concepts)

Common misconceptions

It's also worth mentioning what AI isn't. There are a lot of misconceptions about what artificial intelligence is; here are some common incorrect beliefs:

  • AI is synonymous with robots. AI is not limited to robotics; it's a broader field that includes various technologies like search algorithms and natural language processing.
  • AI can surpass human intelligence anytime soon. The idea that AI will soon outsmart humans is exaggerated; Artificial General Intelligence (AGI) is still theoretical and far from realization.
  • AI understands content like humans do. AI doesn't "understand" text or speech in the human sense; it processes data based on patterns but lacks comprehension.
  • AI is unbiased. Contrary to belief, AI can inherit biases from its training data or designers, meaning it's not inherently unbiased. Check out our guide on the ethics of generative AI to learn more.
  • AI can replace all human jobs. While AI can automate specific tasks, it cannot replace jobs that require emotional intelligence, creativity, and other human-specific skills.

You can explore the differences between AI and machine learning, and between machine learning and deep learning, in separate guides.

AI Glossary

We'll use a range of terms in our exploration of artificial intelligence, some of which may be unfamiliar. We've created a list of key artificial intelligence terms and their meanings:

Algorithm. A set of rules or instructions that a computer follows to perform a specific task. Algorithms are the building blocks of all AI systems.

Artificial General Intelligence (AGI). A currently theoretical form of AI that would have the ability to understand, learn, and apply knowledge across different domains, reason through problems, have consciousness, and even have emotional understanding. This is in contrast to Narrow AI, which is designed and trained for a specific task.

Deep Learning. A specialized type of machine learning that mimics the way our brain works, allowing computers to learn from experience and understand the world in terms of a hierarchy of concepts.
In simple terms, deep learning is like a virtual brain that helps computers learn from data so they can make decisions on their own.

Machine Learning. A way to give computers the ability to learn from data and make decisions without being explicitly programmed. Think of it as teaching computers to learn from experience, much like humans do. In essence, machine learning is the method by which AI gets the "intelligence" part of its name. You can learn more about the topic in our Understanding Machine Learning course.

Natural Language Processing (NLP). A field of AI that focuses on the interaction between computers and humans through natural language. The ultimate objective of NLP is to enable computers to understand, interpret, and generate human languages in a way that is both meaningful and useful. Check out our Introduction to Natural Language Processing in Python course to find out more.

Neural Network. A computing model inspired by the structure of neurons in the human brain. Neural networks are used in various applications that involve pattern recognition, such as image and voice recognition. We have a whole article exploring what neural networks are.

Types of Artificial Intelligence

AI can be categorized based on its capabilities and functionalities. When it comes to capabilities, we can distinguish between the following types of AI:

Narrow AI. Also known as Weak AI, narrow AI is designed and trained to perform a specific task. It operates under a limited, pre-defined set of conditions and doesn't possess the broad range of capabilities that humans have. Most current AI systems, including reactive machines and limited memory machines (see below), fall under this category.

Artificial General Intelligence (AGI). Also known as General AI, this type of AI would have the ability to understand, learn, and apply knowledge across different domains. It would be capable of self-awareness, reasoning, and emotional understanding. General AI remains largely theoretical at this point.

Artificial Super Intelligence. This is an advanced form of AI that would surpass human intelligence in nearly all aspects, from creativity and social intelligence to problem-solving abilities. Super AI is a concept that exists more in the realm of science fiction and future speculation than in current reality.

We can also categorize artificial intelligence based on its functionality:

Reactive machines. These are the most basic forms of AI, designed to perform specific tasks. For example, IBM's Deep Blue, a chess-playing supercomputer, falls under this category. Reactive machines cannot store memories or use past experiences to inform current decisions.

Limited memory. Limited memory AI can store past data and use it to make better predictions or decisions. This type of AI is commonly found in recommendation systems like those used by Netflix or Amazon.

Theory of mind. This is a theoretical concept that refers to AI systems potentially understanding human emotions, beliefs, and thoughts. While intriguing, we have yet to achieve this level of AI sophistication.

Self-awareness. The pinnacle of AI development would be self-aware machines that understand their own existence and can make decisions based on self-interest. This remains a subject of ongoing research and ethical debate.

Applications and Examples of AI

AI's reach extends far beyond academia and specialized industries.
Here are some ways in which artificial intelligence is used in today's world:

Everyday technology

AI is deeply integrated into the technologies we use daily. From Google Maps optimizing your route based on real-time traffic data to Siri and Alexa setting your alarms and answering your questions, AI is near omnipresent. These applications often use Narrow AI to perform specific tasks efficiently.

Business and industry

The business world is already embracing AI, with an IBM survey finding that more than a third of companies (35%) reported using AI in their business in 2022. Organizations across many sectors are finding uses for artificial intelligence, including:

  • Healthcare. AI algorithms can analyze medical images to identify early signs of diseases like cancer. They can also assist in drug discovery by predicting how different compounds might treat diseases.
  • Finance. Artificial intelligence is used in fraud detection, where machine learning algorithms analyze transaction patterns to flag unusual activity. It also plays a role in algorithmic trading, portfolio optimization, and personalized banking services.
  • Retail. Tools like recommendation systems in online shopping platforms are often powered by AI, helping businesses upsell and cross-sell products. AI can also assist in inventory management and demand forecasting.

"Large Language Models like ChatGPT are revolutionizing the way we interact with software. Whether it's customer service, project management, or data analysis, these AI tools are enhancing efficiency, accuracy, and productivity across all sectors."
Noelle Silver Russel, Global AI Solutions & Generative AI & LLM Industry Lead at Accenture

Gaming and entertainment

As we saw in our article on creativity and generative AI, there is a whole new frontier of art that artificial intelligence can help facilitate. Here are some of the ways it's currently in use:

  • Video games. Algorithms control non-player characters (NPCs), making them more responsive and realistic. Advanced AI can even adapt to individual players' behavior to adjust the game's difficulty level.
  • Music and film. Content recommendations on platforms like Spotify and Netflix use AI, and it can even assist in the creative process, such as composing music or helping with film editing.

Public services and infrastructure

We're seeing government agencies and similar organizations use artificial intelligence for a host of different tasks:

  • Traffic management. AI algorithms can analyze traffic data in real time to optimize signal timings, reducing congestion and improving road safety.
  • Emergency response. Areas such as natural disaster prediction and response can benefit from artificial intelligence, for example by forecasting hurricanes and optimizing evacuation routes.

How Does AI Work?

To truly grasp the essence of artificial intelligence, it's helpful to understand the steps that go into making an AI system function. Let's break it down in a beginner-friendly manner. You can get a full understanding of the AI fundamentals with our skill track, which covers actionable knowledge on popular AI topics like ChatGPT, large language models, generative AI, and more.

(Figure: The AI and machine learning workflow)

Step 1: Data collection. The first step in any AI project is gathering data. This could be anything from pictures and text to more complex data like human behavior. The data serves as the raw material that the AI system will learn from.

Step 2: Data preparation. Once the data is collected, it needs to be prepared and cleaned. This means removing any irrelevant information and converting the data into a format that the AI system can understand.

Step 3: Choosing an algorithm. An algorithm is like a recipe for how the AI system will process the data. Different algorithms are better suited to different tasks. For example, you might use a specific algorithm for image recognition and another for natural language processing. You can explore various types of algorithms in a separate article.

Step 4: Training the model. The prepared data is fed into the chosen algorithm to "train" the AI model. During this phase, the model learns to make predictions or decisions based on the data. Think of this as the AI system studying for an exam.

Step 5: Testing the model. After training, the model is tested to see how well it performs. If it's not accurate enough, it may need to be trained further or adjusted.

Step 6: Deployment. Once the model is trained and tested, it's ready to be deployed into a real-world application. This could be anything from a chatbot answering customer queries to a medical AI analyzing X-rays.

Step 7: Ongoing learning. Many modern AI systems have the ability to learn and adapt over time. This means they can improve their performance as they gather more data, making them more efficient and accurate.
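To make these steps concrete, here is a minimal, hedged sketch of steps 1 through 6 using scikit-learn and its built-in Iris dataset (both chosen purely for illustration; a real project would substitute its own data, algorithm, and deployment target):

    # Sketch of the workflow above: collect data, prepare it, choose an algorithm, train, test, and predict.
    from sklearn.datasets import load_iris                      # Step 1: data collection (a toy dataset stands in)
    from sklearn.model_selection import train_test_split        # Step 2: data preparation (here, just a train/test split)
    from sklearn.linear_model import LogisticRegression         # Step 3: choosing an algorithm
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    model = LogisticRegression(max_iter=200)                    # Step 4: training the model
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))    # Step 5: testing the model
    print(f"Test accuracy: {accuracy:.2f}")

    print(model.predict(X_test[:1]))                            # Step 6: "deployment" - predict on a new input

Step 7 (ongoing learning) would then correspond to retraining the model periodically as new data arrives.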
Getting Started With AI

This article has covered some of the very basics of AI. If you're intrigued by what you've read so far and want to learn more about this fascinating field, we have a comprehensive guide on how to learn AI from scratch, which covers everything you need to know to get started on the path to AI mastery. You'll find resources to help you make a start, as well as a sample learning plan that can guide you through your first few months of learning AI.


In 2022, DUET administered the NWPGCL Junior Assistant Manager exam. The recruiting authority has already published the new exam date, so here we analyze the questions from the 2022 exam.

North-West Power Generation Company Ltd. - 2022
Post: Junior Assistant Manager (ICT)
Date: 26.08.2022; Marks: 100; Time: 1 hour
Exam Taker: DUET

Departmental

1. If NORTH = 23 and WEST = 25, then what is POWER = ?

2. What is the output of the following code?

    int main() {
        int a = 1;
        printf("%d%d%d", ++a, a, a++);
        return 0;
    }

3. Implement the following two Boolean functions using NAND gates only:
   (a) F = A + (B' + C)(D' + B*E')
   (b) F = ((A + B) + CD)E

4. Write the SQL commands for the following two tasks:
   (a) Find all information about employees from the emp_info table where the employee's salary is more than 20,000 and the city is Dhaka.
   (b) Update the employee name to 'Mr. X' in emp_info for the employee whose emp_id is 2.

5. A network IP address is 172.16.236.92/27. Find the (a) subnet mask, (b) network address, and (c) broadcast address. (A worked sketch follows the question list.)

Non-Departmental

6. How many triangles are in the following figure? (figure not reproduced)

7. In a class, the total number of students is 120. 40% of the students are female and the rest are male. 50% of the female students passed the exam, and 72 students passed in total. Find the percentage of male students who passed.

8. Answer the following questions about the Padma Bridge:
   (a) What is the project name of the Padma Bridge?
   (b) What are the length and width of the Padma Bridge?
   (c) How many connecting roads are there on both sides of the Padma Bridge?
   (d) Which authority maintains the Padma Bridge in Bangladesh?

9. Write down the names of the top 5 countries affected by COVID-19.

10. Write down at least five renewable energy sources used in Bangladesh.

11. Suppose (x < 0) ? 10 : 15. If x = 5, which is the correct answer?
    (a) 10   (b) 15   (c) 0   (d) 1

12. English:
    (a) নিরক্ষরতা একটি অভিশাপ। (translate into English)
    (b) Crying in the wilderness. (translate into Bangla)
    (c) He is found his duty practices.

13. Bangla:
    (ক) Which of the following is the correct spelling?
        (a) শূন্য   (b) পুণ্য   (c) ত্রিভুজ   (d) ভূবণ
    (খ) What is the sandhi (compound) breakdown of মৃন্ময়?
        (a) মৃৎ + ময়   (b) মৃত + ময়   (c) মির + ময়   (d) মিন + ময়

14. General Knowledge:
    (1) NWPGCL is under which organization?
        (a) PGCB   (b) DPDC   (c) BREB   (d) BPDB
    (2) Which award did the Ministry of Power, Energy and Mineral Resources receive in 2021?
        (a) Ekushey Padak   (b) The Independence Award   (c) Agricultural Award   (d) President Award
    (3) Which country is known as the land of the midnight sun?
        (a) Finland   (b) Denmark   (c) Norway   (d) Japan
    (4) 802.16 is the standard for ___________?
        (a) Wireless broadband   (b) Wireless LAN   (c) Bluetooth   (d) Connecting two LANs
    (5) Which protocol is used for a user to log in to a remote host from a local host?
        (a) TCP   (b) FTP   (c) HTTP   (d) Telnet
    (6) The power set of the empty set is?
        (a) 0   (b) zero   (c) empty set   (d) Vacant set
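For question 5, the values can be worked out by hand (/27 leaves 5 host bits, so addresses come in blocks of 32, and 172.16.236.92 falls in the block from .64 to .95). Here is a small sketch that checks the answer with Python's standard ipaddress module, used here purely for verification:

    # Verify question 5: subnet mask, network address, and broadcast address for 172.16.236.92/27.
    import ipaddress

    network = ipaddress.ip_interface("172.16.236.92/27").network
    print("Subnet mask:      ", network.netmask)            # 255.255.255.224
    print("Network address:  ", network.network_address)    # 172.16.236.64
    print("Broadcast address:", network.broadcast_address)  # 172.16.236.95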
Subjects on which questions are set for the diploma-level posts:

Departmental:
1. Programming: C/Java/Python or OOP
2. Data structures & algorithms
3. Networking (especially subnetting)
4. Databases & SQL queries
5. Hardware (computer fundamentals and microprocessors)
6. Operating system basics & Linux commands

Non-Departmental:
  • Mental ability / mathematical reasoning / analytical ability
  • General knowledge (recent events, Bangladesh, and international)
  • Present achievements of the Government of Bangladesh and related international topics
  • Independence, the Language Movement, and industrial and power-sector related issues

English: basic grammar & translation
Bangla: Bangla grammar

Power sector: keep yourself up to date with power-sector knowledge. Useful links:
  • https://powerdivision.gov.bd/site/page/6cd25d49-3150-482a-8bd0-701d18136af7/এক-নজরে
  • https://nwpgcl.gov.bd/site/page/48d74474-5bd6-4561-8420-9058c8958d85/-
  • https://nwpgcl.gov.bd/sites/default/files/files/nwpgcl.portal.gov.bd/annual_reports/fed48898_b9aa_4847_b086_12b2261fcc6a/2024-12-22-05-36-532dfa37c37a907e306645d46f547165.pdf

