Data Structures Unit 1



Linked Lists:

A linked list is a dynamic data structure used to store and manage a collection of elements called nodes. Each node contains two general components: the data, which can be of any type and any amount (strings, ints, floats, etc.), and a pointer to the next node (or also to the previous node in a doubly linked list, which shouldn't be on this exam). The pointers are what link the nodes together and make them a "list".
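
A minimal sketch of a singly linked list node and an insert-at-the-front helper (the struct matches the node used in the traversal example further down; the insertFront helper itself is illustrative):

struct node {
    int data;          // the stored value (in general this could be any type)
    node *next;        // pointer to the next node; nullptr marks the end of the list
};

// Create a new node holding value and link it in front of the current head.
node* insertFront(node *head, int value) {
    node *n = new node;
    n->data = value;
    n->next = head;    // the new node points at the old head
    return n;          // the new node is now the head of the list
}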


Recursion:

Recursion is a fundamental technique in computer science and programming in which a function repeatedly calls itself until a certain base case is reached. It generally has three main requirements:
  1. Repetition: the function has to be able to call itself and repeat; if it only runs once, it is not recursive, just a normal function.

  2. Condition: there needs to be a stopping point so that the repetition does not run infinitely. Once the base case is reached, the recursion stops and the result calculated so far is returned.

  3. Change in Parameter: each recursive call must be made with a modified argument; without that change there is no progress toward the base case, and the repetition goes nowhere (just like a loop whose counter never changes).

Important recursive case examples that may be on the exam:

1. Fibonacci:

int fib(int n) {                         // the reason behind the repetition is due to Fibonacci being repeated addition
    if (n == 0) {                        // condition to return the Fibonacci of 0 and 1 as 0 and 1, respectively.
        return 0;
    } else if (n == 1) {
        return 1;
    } else {
        return fib(n - 1) + fib(n - 2);  // the change in parameter can be seen in the n - 1 and the n - 2
    }
}
Time Complexity: O(2^n)

2. Factorial:

int factorial(int n) {                 // the reason behind the repetition is due to factorials being repeated multiplication
    if (n == 0 || n == 1) {            // condition to return 1 when n is 0 or 1
        return 1;
    }
    else {
        return n * factorial(n-1);     // the change in parameter: call factorial with n - 1, then multiply the result by n
    }
}
Time Complexity: O(n)

3. Recursive Traversal Through a Linked List:

void recprint(node *curr) {
  if (curr == nullptr) {                    // condition: stop once we run off the end of the list
    return;
  }
  cout << "data: " << curr->data << endl;   // repetition: print each node's data as the list is traversed
  recprint(curr->next);                     // change in parameter: recurse on the next node in the list
}
Time Complexity: O(n)

Sorting:

Sorting is another fundamental operation in computer science and programming that involves arranging elements in a specific order, such as ascending or descending. The commonly used algorithms are listed below:

  1. Bubblesort: a simple sorting algorithm that repeatedly steps through a list or array, compares adjacent elements, and swaps them if they are in the wrong order. The pass through the list is repeated until no swaps are needed, which indicates that the list is sorted. (A sketch of the code follows this list.)

  2. Bubblesort V2: a variation of the basic Bubblesort that includes an optimization to reduce the number of unnecessary comparisons. (Sketched below.)

  3. Selection Sort: an algorithm that repeatedly selects the minimum (or maximum) element from the unsorted part of the array and moves it to the sorted part. (Sketched below.)

  4. Insertion Sort: a sorting algorithm that builds the sorted array one item at a time, placing each element in its correct position within the sorted portion of the array. (Sketched below.)
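
Minimal sketches of all four algorithms are shown below (the function names, the use of plain int arrays, and the early-exit flag used for Bubblesort V2 are illustrative assumptions, not the exact course code):

// Basic Bubblesort: always makes n - 1 passes, swapping adjacent out-of-order pairs.
void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];       // swap the adjacent pair
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}

// Bubblesort V2: stops early if a full pass makes no swaps (the array is already sorted).
void bubbleSortV2(int arr[], int n) {
    bool swapped = true;
    for (int i = 0; i < n - 1 && swapped; i++) {
        swapped = false;
        for (int j = 0; j < n - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }
    }
}

// Selection Sort: repeatedly select the minimum of the unsorted part and move it to the front.
void selectionSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int minIndex = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIndex]) {
                minIndex = j;            // remember the smallest element seen so far
            }
        }
        int temp = arr[i];               // move the minimum into the sorted part
        arr[i] = arr[minIndex];
        arr[minIndex] = temp;
    }
}

// Insertion Sort: grow a sorted prefix, inserting each new element into its correct position.
void insertionSort(int arr[], int n) {
    for (int i = 1; i < n; i++) {
        int key = arr[i];
        int j = i - 1;
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];         // shift larger elements one slot to the right
            j--;
        }
        arr[j + 1] = key;
    }
}

All four run in O(n^2) time in the worst case; with the early-exit flag, Bubblesort V2 (like Insertion Sort) finishes in roughly O(n) when the input is already sorted.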

Tracing: in the context of sorting, tracing refers to the process of meticulously documenting or visualizing the step-by-step execution of a sorting algorithm on a given input. It involves tracking how elements in the data structure (e.g., an array or a list) are compared, swapped, or moved during the sorting process.
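
For example, tracing basic Bubblesort on the (illustrative) array {5, 2, 4, 1}, one pass at a time:

Pass 1: {2, 4, 1, 5}   // 5 bubbles to the end
Pass 2: {2, 1, 4, 5}   // 4 moves into place
Pass 3: {1, 2, 4, 5}   // sorted; no further swaps are needed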


Arrays:

There are 2 types of arrays: static and dynamic.

Static arrays have their memory allocated on the stack and have a fixed, predetermined size that is set at compile time and cannot be changed.

Pros: They are simple to declare and use, and elements can be read quickly and easily by index.

Cons: Their fixed, inflexible size makes them less versatile.

int myStaticArray[5]; // Declare a static array of integers with size 5

Dynamic arrays, on the other hand, have their memory allocated on the heap, are flexible, and can be allocated, grown, or freed at any point during runtime.

Pros: They are flexible and versatile when it comes to allocating size.

Cons: Memory management is more complex, and forgetting to free the memory can lead to issues such as memory leaks.

int* myDynamicArray = new int[5]; // Declare and allocate a dynamic array of integers with size 5
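
A short sketch of how a heap-allocated array can be "resized" at runtime by allocating a new block, copying, and freeing the old one (the variable names and sizes here are illustrative):

int main() {
    int size = 5;
    int *arr = new int[size];            // allocated on the heap; size chosen at runtime
    for (int i = 0; i < size; i++) {
        arr[i] = i;                      // fill the array
    }

    // "Resize" by allocating a larger block, copying the old elements over, and freeing the old block.
    int newSize = 10;
    int *bigger = new int[newSize];
    for (int i = 0; i < size; i++) {
        bigger[i] = arr[i];
    }
    delete[] arr;                        // forgetting this delete is exactly how memory leaks happen
    arr = bigger;
    size = newSize;

    delete[] arr;                        // free the memory once the array is no longer needed
    return 0;
}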

Time Complexity (Big O):

Big O or "Order of Growth" describes the upper bound of the time complexity of an algorithm. It is a way to express how an algorithm's running time or space requirements grow as the size of the input data increases.

Some examples include, in order of increasing growth rate: O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), and O(n!).

Best Case Situations: the minimum or "ideal" amount of time an algorithm takes to complete its task.

Worst Case Situations: the maximum or "worst" amount of time an algorithm takes to complete its task.
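
As a small illustration of best vs. worst case, consider a linear search (an assumed example, not taken from the notes above): the best case is O(1) when the target is the first element, and the worst case is O(n) when it is last or not present.

// Linear search through an unsorted array.
// Best case: the target is at index 0, so only one comparison is made -> O(1).
// Worst case: the target is at the end or missing, so all n elements are checked -> O(n).
int linearSearch(int arr[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == target) {
            return i;      // found: return its index
        }
    }
    return -1;             // not found
}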

